In an era where our digital footprints grow larger by the day, the intersection of technology and human behavior has never been more crucial. Prof. Mainack Mondal is committed to making the digital world not just secure, but user-friendly.
With a Ph.D. from the Max Planck Institute for Software Systems and postdoctoral stints at the University of Chicago and Cornell Tech, Prof. Mondal brings a wealth of international experience to his work. His research, which has garnered attention at top-tier conferences and earned him a Google India faculty research award, focuses on a deceptively simple yet important question: How can we create online security systems that people will actually use?
Join us as we explore the human side of ones and zeros, and discover why understanding user behavior might just be the key to unlocking a more secure digital future.
Can you share with us your academic journey and what motivated you to specialize in incorporating human factors in security and privacy?
My fascination with security and cryptography began during my undergraduate years, rooted in my passion for mathematics. At IIT Kharagpur, I had the privilege of taking security courses and collaborating with faculty experts in the field. This foundation led me to pursue research in secure and private systems during my Ph.D.
As I dug deeper, I observed a disconnect between system developers and user behavior. Many research papers, while rigorous, tended to justify system usefulness based on features and performance rather than empirical data on user needs. This realization crystallized when I submitted my own papers and encountered questions about user requirements, system adoption, and realistic threat models.
Driven by these gaps, I sought out communities focused on understanding user behavior and data-driven system development. I recognized an opportunity to contribute meaningfully by incorporating human factors into system design. This insight has since shaped my research interests and career trajectory.
How does it feel to be recognized as one of the recipients of the Early Career Award, and what significance does this recognition hold for your research and career?
Receiving the Early Career Award from the IITB Trust Lab is truly humbling. I’m deeply grateful to everyone who’s been part of this journey — my collaborators, students, and mentors. The openness and community spirit of these individuals have been crucial in shaping my work.
This recognition is meaningful to me for two key reasons. First, it validates that my research agenda is yielding valuable solutions and generating interest in the field. It’s encouraging to see that others are eager to engage with and build upon this work. Second, the IITB Trust Lab offers an invaluable platform for growth and collaboration. The prospect of visiting IIT Bombay excites me—it’s a chance to exchange ideas, expand my research horizons, and make new connections. Engaging with such a prestigious institute marks a significant milestone in my career, opening doors to fresh collaborative opportunities.
All in all, this award isn’t just a personal achievement; it’s a stepping stone towards broader impact in our field.
Could you provide some insights into your current research projects focused on developing usable privacy and security mechanisms for online users?
My current research agenda centers on usable security and privacy, with a keen focus on how emerging technologies are reshaping this landscape. Take digital transactions, for instance. Not long ago, people were wary of sending money online. Now, with technologies like UPI, we’re seeing trillions of rupees transacted monthly. This shift brings new security and privacy challenges to the forefront. Someone who gains access to your phone could potentially intercept your OTP and steal your money, or easily check your bank balance without ever setting foot in a bank.
Emerging technologies have also amplified issues like stalking. The sheer number of ways to track individuals nowadays is frankly alarming. My research addresses these concerns, particularly examining how new technologies, including AI, are creating novel privacy and security vulnerabilities across different cultural contexts.
I’m working on developing more culturally aware systems that offer enhanced protection against stalking and other privacy threats. In essence, my focus is twofold: addressing the new security challenges brought about by emerging technologies and creating user-friendly systems to tackle these challenges head-on.
What are some of the key challenges you face in your research, particularly in minimizing system abuse while ensuring usability for online services?
In my research on usable privacy and security, I’d say we face several key challenges. First, we need to understand how people are actually using emerging technologies and what problems they’re running into. This isn’t always straightforward, as user behavior can be complex and varied. The next big challenge is rigorously collecting data to get a clear picture of these privacy and security issues. Once we have that understanding, we can move on to designing, building, and deploying systems that tackle these challenges.
But our work doesn’t stop there. It’s crucial to measure whether these systems are actually useful in real-world scenarios. Are they making a tangible difference in users’ security and privacy?
To address these challenges, I often take a hands-on approach by building prototypes and proof-of-concept systems. This strategy has two main benefits. First, it allows us to more easily evaluate whether the systems are truly usable and whether they’re effectively improving security and privacy. Second, as a researcher and computer engineer, I find it incredibly rewarding to create systems that have a real, positive impact on people’s lives.
This approach helps bridge the gap between theoretical research and practical solutions, ensuring that our work isn’t just academically interesting, but also genuinely helpful to the end users.
Are the systems and prototypes you’re working on specific to the Indian context, or do they have broader applications?
It really depends on the problem we’re tackling. One thread of my research focuses on developing culture-aware systems. Take ChatGPT, for example. Its responses might vary across cultures, so it’s crucial to build systems that can identify and address potential cultural friction.
Consider the differences in spending behaviors, privacy concerns, and transaction models between the US and India. In the US, you might primarily use a credit card, while in India, a card transaction often requires an additional OTP. These regional variations in transaction models necessitate the creation of distinct systems.
That said, some of my projects have broader applications. For instance, I’m working on improving data dashboards for platforms like Google, Facebook, or Amazon. These dashboards are designed to display privacy data, showing users what information is collected and how it’s used. This kind of transparency is universally beneficial.
When it comes to culture-specific work, such as building machine learning models for a particular cultural context, we need to tailor both the deployment and the system design to that context.
So, to sum up, the specificity of our systems really depends on the problem at hand. Some are indeed tailored to specific cultural contexts like India, while others have more universal applications.
As an early-career researcher, how do you stay ahead or stay abreast of the advancements in this rapidly evolving field?
In this fast-moving field, just think about how many new apps you’ve installed in the last year. Now multiply that by about 50 million, and you start to get a sense of the scale we’re dealing with. It’s massive.
To stay on top of trends, I use a few strategies. First, I keep an eye on the papers being published. They often give us a heads-up on where the field is heading. But what I find really valuable is talking to a wide range of people. For the kind of research I do, real-world problems are gold. So I chat with all sorts of folks – researchers, tech enthusiasts, and everyday users. Each group brings something unique to the table.
I connect with researchers both in India and abroad. But I also make sure to talk to regular tech users who might not be involved in research but are very aware of the issues they face day-to-day. This mix helps me spot problems and understand user behavior from different perspectives.
It’s kind of a two-pronged approach. On one hand, I’m diving into research papers and talking with other researchers – that’s the top-down bit. On the other, I’m out there talking directly with users, getting that grassroots perspective on the tech they’re using. It’s this combination that really helps me get a full picture of what’s going on in the field.
What are your aspirations for the future in this field?
This area of research is really close to my heart. I want to build a community of researchers who truly care about the human factors in designing and deploying secure, private systems. I’ve taught usable security courses before, including one in Chicago, but the number of people in this field is still relatively small, especially when you compare it to more established areas in Indian academia, like cryptography.
I see my role as helping people understand why this field matters and giving them a sense of how to approach it. There are techniques for building systems with usability in mind that can be learned. It’s important to remember that even if you’re an engineer and not specifically a designer, there are effective ways to integrate human-centered principles into your work.
A big part of this is bringing together people from different fields – Human-Computer Interaction (HCI), systems security, cryptography, system building. It’s about creating a space where these different perspectives can come together to tackle the challenges we’re facing.
Looking ahead, I’m excited about growing this community and seeing how we can make security and privacy more accessible and user-friendly for everyone. It’s not just about building secure systems; it’s about building secure systems that people can and want to use.
For those students with a technology background who are interested in learning about qualitative data and human behavior, how can they effectively start exploring this area?
I had to learn this myself, actually. My advisor wasn’t too keen on it because his expertise was in quantitative data. My training was in coding, statistical analysis, and working with medium- and large-scale data. But I realized I needed to understand the qualitative aspect to fully grasp the meaning behind my quantitative data.
So, I went back to basics. I started reading books and tried my hand at smaller-scale surveys. With our tech background, we can leverage tools like audio-to-text conversion, Python scripts, and other automation techniques. I believe taking a course to learn the techniques and then applying them to a project is really effective.
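As a purely illustrative example of the kind of lightweight automation Prof. Mondal alludes to, the sketch below shows a minimal Python script that tallies rough keyword hits per hand-defined theme across transcribed interview files. The folder name, themes, and keywords are hypothetical placeholders rather than part of his actual workflow, and such tallies only flag passages for closer manual coding; they don’t replace it.

```python
# Illustrative sketch only: tally keyword hits per theme across transcribed
# interviews, as a rough first pass before manual qualitative coding.
import re
from collections import Counter
from pathlib import Path

# Hypothetical theme codebook: theme name -> keywords a coder associates with it.
THEMES = {
    "privacy_concern": ["privacy", "tracking", "surveillance"],
    "usability": ["confusing", "easy", "hard to use"],
    "trust": ["trust", "scam", "fraud"],
}

def tally_themes(transcript_dir: str) -> Counter:
    """Count keyword occurrences per theme across all .txt transcripts."""
    counts = Counter()
    for path in Path(transcript_dir).glob("*.txt"):
        text = path.read_text(encoding="utf-8").lower()
        for theme, keywords in THEMES.items():
            counts[theme] += sum(len(re.findall(re.escape(k), text)) for k in keywords)
    return counts

if __name__ == "__main__":
    # Assumes a local "transcripts" folder of text files, e.g. produced by an
    # audio-to-text tool; the counts just point a human coder at where to look.
    print(tally_themes("transcripts"))
```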
You’re obviously very passionate about your research. What other interests or areas inspire you?
I do a lot of reading – historical books, non-fiction, poetry, and I love puzzles. I also watch quite a bit of sports, especially football and cricket. I really enjoy talking to people, picking their brains about ideas, and learning about the issues they’re facing. It’s fulfilling when I can identify a problem that needs solving and actually do my best to solve it. This curiosity and problem-solving mindset extends beyond my research into my personal interests as well.