Together with industry leaders and experts, the Curiosity team uncovers actionable strategies to navigate the outer loop of software delivery, streamline test data management, and elevate software quality!
Transcript & timestamps
Rich Jordan (00:01.243)
Hello everyone and welcome back to the next in the series of the Why Didn't You Test That podcast. I'm Rich Jordan, your resident host of the podcast, it would seem. Joining me today we have James Walker. Say hello, James.
James Walker (00:15.32)
Hi everyone, glad to be back again. Thank you, Rich.
Rich Jordan (00:18.897)
Cool. And joining us today, we have a very special guest. We have Sanjay Kumar. So Sanjay is an inventor and creator of SelectorsHub, Testing Daily, TestCase Studio, TestCase Hub, and ChroPath. He's on a mission to make testers' and developers' lives easy, not working for any company, but working full time for the community. That is an awesome sentence in your LinkedIn profile. Welcome Sanjay. Welcome to the podcast. Cool.
Sanjay (00:43.339)
Thank you Rich. Hello James. Hi Rich. Hello everyone.
Rich Jordan (00:46.863)
Let's first start by congratulating you and SelectorsHub. A million downloads, that is an awesome number.
Sanjay (00:54.346)
Thank you. Thank you so much with all your support.
James Walker (00:55.37)
Unbelievable. Unbelievable.
Rich Jordan (00:55.761)
That is awesome. So let's get going. First of all, you know, many, many, a million people it would seem, know who you are already. But for those who don't, tell us a bit about yourself and a bit about SelectorsHub, and how you got to where you are today.
Sanjay (01:12.865)
Yeah, sure. My name is Sanjay Kumar and I have been developing these tools for the community for quite a long time. It's been, I guess, almost more than eight years now. So ChroPath, SelectorsHub, TestCase Studio, Testing Daily, now Check My Links, AutoTestData, Page Load Time. There are many tools out there, which have touched pretty much every tester's life.
My only goal is to find out the daily problems of testers and create solutions for them. And that's how I have been working for the community throughout my career. Now, all this has been possible, of course, not just by me; there is huge community support. Everyone out there has been supporting SelectorsHub and all these tools from day one. So that's how it has been.
Rich Jordan (02:02.516)
Yeah.
Sanjay (02:08.705)
I live in India, in the beautiful city of Bangalore. I started SelectorsHub from Kanpur, which is a very famous city in Uttar Pradesh. So yeah, we launched SelectorsHub in 2020. And today, in 2024, we have touched more than a million testers' lives; downloads actually, and we have touched more than that. And we are hopeful that soon we will
reach each and every tester's and developer's life with our solutions, and we'll make sure that we help them in all possible ways we can. And yes, going forward we'll talk more about all these things in this podcast.
Rich Jordan (02:48.743)
At this rate, there can't be many testers left that you haven't touched. So obviously you've rattled off lots and lots of different kinds of capabilities that you've built. You're a serial inventor, or creator of capabilities, for basically the testing industry. How do you go about understanding the community's problems and then working your way through to building out a solution?
Sanjay (02:52.192)
Yes, definitely.
James Walker (02:53.804)
Mm -hmm.
Sanjay (03:16.523)
So one of the major things which I follow at SelectorsHub is that we treat our users' feedback as the very highest priority. So whenever any user gives any kind of feedback, whether that's negative feedback or anything they require, it drives many, many of the innovations which we have in SelectorsHub today. One of the major things in SelectorsHub, how it got invented, it was the criticism of the community,
or you can say of the entire testing world. You must have heard this saying: don't use an XPath plugin or selectors plugin, you will lose your XPath writing skills. So that's where I got this idea: why do people criticize this particular thing, when it is an innovative tool which is going to save people's time? Like, every tool out there in the world is basically an automation tool. Every single thing, like LinkedIn, is also an automated tool which basically
Rich Jordan (04:07.014)
there.
Sanjay (04:09.395)
automated the process of communicating between professionals and finding jobs. Suppose for a second LinkedIn is not there; how will I find people? I will go door to door, knock on the door: hey, are you looking for this job? Hey, I have created this tool, do you want to use this tool? Every single product in this world is an automation tool. So that's where I got an idea. People used to say that you should not use an XPath tool, and that's where I got the idea of inventing SelectorsHub, which would be a smart editor
for XPath and selectors, where as you write the XPath it will automatically show you the error message and all those things. And it became... now we have reached a point where people mention it in interviews, and in coaching and training institutes people teach about SelectorsHub and say: use SelectorsHub to learn about XPath and selectors. Earlier, people used to say don't use an XPath tool. So this is how we hear people, and that's how another tool got invented, like TestCase Studio. People were
Rich Jordan (05:00.209)
Yeah.
Sanjay (05:08.635)
struggling with the problems of capturing these steps, screenshots, etc. Then Testing Daily, and so on. These all came from community feedback, from reading about people's problems on LinkedIn. We used to host a lot of meetups across the globe, in India, everywhere; pretty much in every city we have hosted a SelectorsHub meetup, through RTP, through different conferences where we used to go and speak, in these kinds of podcasts with people like you, and then hosting online webinars, etc. So we used to
hear from people what the problems are and dig down. I keep looking into the problems on LinkedIn or different forums: what are the regular things people keep doing? What can be automated? What problem can be solved so that their life will be easy? That's how all these tools were invented and created, and we keep developing each day, every new tool and new things in those tools.
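For readers who want to see what that kind of instant selector feedback looks like in practice, here is a minimal sketch, not SelectorsHub's actual implementation, that validates an XPath expression in the browser and reports either the match count or the parser's error message:

```javascript
// Minimal illustration of live XPath validation, assuming a browser context.
// A simplified sketch only, not code from SelectorsHub.
function checkXPath(expression) {
  try {
    // document.evaluate throws if the expression is syntactically invalid.
    const result = document.evaluate(
      expression,
      document,
      null,
      XPathResult.ORDERED_NODE_SNAPSHOT_TYPE,
      null
    );
    return { valid: true, matches: result.snapshotLength };
  } catch (err) {
    // Surface the browser's own message so the author can correct the expression.
    return { valid: false, error: err.message };
  }
}

console.log(checkXPath("//button[@id='submit']")); // e.g. { valid: true, matches: 1 }
console.log(checkXPath("//button[@id='submit'"));  // { valid: false, error: "..." }
```

A smart editor essentially runs a check like this on every keystroke and feeds the result straight back to the user.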
Rich Jordan (06:06.459)
That's cool. That's really interesting. That sparked a thought, and I don't know whether this is a correct thought or not, but when we go out and talk to a lot of people, a lot of managers of teams, they don't necessarily understand the challenges or the problems. And so when they're asking for solutions, they articulate macro problems rather than the actual problems to be solved.
And I was interested when you talked about talking to the community there; I'm assuming that you mean the practitioners more so than the kind of leaders, who are a bit disconnected. Do you see that as a big challenge in our industry, in terms of a lot of decisions, a lot of budgets that are spent, being basically within the remit of people who don't necessarily understand the challenges?
Sanjay (06:42.015)
Absolutely.
Yes, sir.
Sanjay (07:00.797)
Yes, absolutely. Like you said, we see a lot of budget, or you can say money, being invested in things, in areas, which are not really required, when rather they should invest in something else that could help their team achieve their goals easily; and they don't spend on that for very obvious reasons. Sometimes people don't understand the
James Walker (07:21.783)
Mm -hmm.
Sanjay (07:27.615)
importance of those particular tools or technologies, you can say. Sometimes they feel, okay, this is the problem, this is the regular thing which they have done in their experience, and they think that this is how it's supposed to be done. Like today AI is there, everybody is talking about AI; I didn't want to bring that up, but yeah, still many people don't want to utilize it. Those who are utilizing it in the right way,
they are achieving their goals faster than others, they are learning new things, they are doing things faster. So yes, that's very correct. And in my experience I have seen many, many companies where people are not spending on the right things; rather they are spending on different kinds of things which are not actually required. And I mean, that's a natural thing. Actually, I would say nobody can do things a hundred percent perfectly.
James Walker (08:20.92)
It's interesting, isn't it? I completely agree with you, Sanjay. I think if we go back over the past 10 years, everyone was on this kind of silver bullet approach to automation, right? And I think now we've got past that. I think what's interesting is that in the testing industry, we have such a strong community. And I think that's evident from the number of downloads and installs you have of your products, right? And the number of testers you have onboarded. And in other industries, I just don't think that exists.
Like, we have such a tight-knit community, right? In the development industry, I don't think I know any people who are really influencers, quote unquote, to the same level that we have in the testing community, right? And when you go to these conferences, you see a lot of the same faces giving the talks and everything. I just feel like we have a very unique industry to a certain extent, right? I don't know if you've seen that as well, Rich.
Rich Jordan (09:18.437)
I do, yeah. I guess it's really interesting, right? Because I think a lot of problems that come to testers aren't necessarily theirs; they're inherited by testers, right? And this is, you know, the grand who's-responsible-for-quality idea, let's not get into that, right? But ultimately you have a lot of best practices, you know, new technologies coming along, and those new technologies are there to solve a problem.
The interesting one that goes back a few years now: why do APIs and microservices technology exist? They exist to create bounded contexts, right? To break apart an architecture. But yet, how many times does that actually flow through the architecture, the design, the build, the test? How many times do you actually see that flowing through, and, you know, almost becoming the way that we do these things because that architecture is used? It doesn't, right?
When you talk to testers, too many times they come up with a test strategy that talks very generically about certain ways of doing things and totally ignores the technology with which it's being built. I'm interested to get your thoughts on that, Sanjay.
Sanjay (10:32.065)
Exactly. Like most of the time, as you rightly mentioned, in our testing world we talk about strategies, building the test cases, writing the test cases, the theory part, or you can say the process part; sometimes we get stuck there. On LinkedIn also you can see that most of the time you will find a debate around some particular term or process. So, I mean, in general I would say,
James Walker (10:49.89)
Yeah. Yeah.
Sanjay (11:00.171)
compared to the development side, or the developer side, in the testing community we talk less about the technology and more about the process and setting up this thing or that thing. So that's where I feel, and this is my personal opinion, absolutely it can be wrong, that we should focus more on the solutions, on finding the problems of our day-to-day routine processes, and avoid always debating the process or, I mean,
following the old methods of doing the testing. If we try to solve or create a solution with the technology, the tools and everything, that way we might achieve a better quality of product. We might be able to do things in a better way. Like all the tools under the SelectorsHub umbrella, or other tools which have been developed, like Selenium, Cypress, Playwright, or any other tool like Curiosity; many tools are out there in the industry.
So if we focus on these things, the technology, that's where these tools come in and they make our life a lot easier. So I believe, rather than debating definitions, or always talking at a high level like you hear from most people at testing conferences, you should do this, we should do this, we should do that, I feel we should talk more about the solutions. Like: we should build this, or we have built this particular thing.
Our testers were doing this manually every day, so we automated this process and this is the solution; we have made it free, we have made it open source, like that. That way we can improve the life of testers, we can improve the productivity and the quality of the product, and whatever we are delivering, we can deliver it in a much faster way. And it's absolutely correct, what James mentioned, that what people were doing 10 years ago they are sometimes still doing today;
people invest more money in old processes rather than believing in the new tools, new technology. They have a fear: we might lose our job, we might not be able to use this particular tool or technology, what if we implement this? So yes, that is it.
Rich Jordan (13:14.129)
I'm interested, in terms of your experience, Sanjay, your conversations; obviously you speak to a lot of people, right? And you've been speaking to them for a long time. How much has that conversation evolved, in terms of, you know, the people you spoke to eight years ago, if you can remember that far back, versus the people you speak to today, right? There is an accusation that testing has stagnated, right? It hasn't really changed. We're still trying to solve the same problems we did, you know, 10, 15 years ago.
Sanjay (13:30.155)
Yeah.
Rich Jordan (13:44.113)
You know, arguably we've just been talking about that, right, in terms of strategies not reflecting the technologies that are being implemented today. In your opinion, how have those conversations changed? If they haven't changed, then what do you think needs to be done? Or if they have changed, what kind of good things and bad things are you seeing?
Sanjay (14:05.001)
It's a mix, I would say. There is a set of people, and if you talk to that kind of set of people, the conversation is still happening the same way: we should do this, we should do that; we were doing this particular testing in this way, we should follow that. And some people talk about innovative things: AI came, so now we are implementing AI in our solutions, we have developed this particular solution, now we are using this particular framework, we have redesigned our framework with this new
Sanjay (14:34.741)
particular new tool or new technology. And then there are people who are still unsure whether the old thing is good or the new thing is good. So it is a kind of mix, which has always been the case. It's difficult to give a simple yes or no to this particular question of whether that conversation has changed or not. It has evolved, of course, but not that much. Sometimes you will be surprised at conferences, or by the people you are talking to; they still talk about those old things and you will be surprised: oh, he's still thinking that way.
So that surprise comes sometimes, but it's not true of each and every one. So yes.
Rich Jordan (15:09.297)
Is that a tester skill set thing, or is it more a demographic of where the organization they come from and its context sits? Or is it, like you say, a tester skill thing? Where I'm coming from there is that, almost, context is important, right? And I think a lot of the LinkedIn conversation that you find, you've got some very strong opinions, right, but they come from a certain context, right, and don't necessarily
recognize the context of other people in those conversations. So you've got, you know, large financial services organizations, for example; they've got a lot of legacy mainframe, a lot of technical debt in there, a lot of heavy process. They are very constrained in the way certain things can work, right? On the very flip side, you've got your startups where, you know, the complexity is quite low, the opportunity to implement new ways of working is there, you know, the controls aren't necessarily where, you
Sanjay (15:52.884)
Yes.
Sanjay (15:58.548)
Yeah
James Walker (16:02.22)
Yeah, low risk.
Rich Jordan (16:07.259)
know, the regulated organization would have kittens, you know, but they are trying to have the same conversation. Do you see that? Is that a challenge?
Sanjay (16:07.284)
Yes, the complete thing has been.
Sanjay (16:18.269)
Yes, yes, yes, definitely. This is a big challenge sometimes. And in fact, like you rightly mentioned, I see this on LinkedIn. We see it on LinkedIn, we see it at many conferences as well. This happens a lot; we have to face this kind of challenge where people are stuck with those kinds of conversations and they try to implement their old methods, and it's difficult to
Rich Jordan (16:28.337)
Excuse me!
James Walker (16:29.004)
Mm -hmm
Sanjay (16:46.325)
Like you said, mainframe; I have worked with mainframe as well, in the earliest stages of my career, at IBM. Because their system, their everything, has been designed and developed in that particular technology, it's difficult to move to another thing. But today we have so, so much flexibility; different kinds of tech, technology, tools, everything is there. So yes, there are lots of conversations,
and lots of people there on LinkedIn, or any social media platform you can say, not just LinkedIn, and at conferences, yes, you will run into the kind of conversation and discussion where you will feel: I could have solved this particular problem, and whatever they are doing or saying is a very, very old thing. And in fact, in practice, in reality, those things are not happening in industry, I should say, in these new startups.
Maybe in MNCs, in big companies, where they follow lots of process and they don't want to change things because those are stable, there it could be possible that the solution they are talking about gets implemented; and because senior people are there, or maybe because what they have works for whatever reason, they might not want to change. But in new companies, those that are evolving these days, startups and everywhere, they are changing very, very fast, rapidly; today they might be using
Rich Jordan (17:49.649)
Yeah.
Sanjay (18:08.382)
this framework, and just after one month they might have entirely rewritten it as another thing, and they're doing it much faster. And that's how today's companies are becoming unicorns, or going to IPO, etc., in very, very little time compared to how long things used to take.
James Walker (18:25.696)
To me, it's like a double-edged sword, right? I think with the amount of technology there is today, it's so easy to move fast, right? If you're developing a product, you can go into GitHub, you can download a library, you know, you can assemble components together. You can build an app pretty quickly, right? If you want to, and you have the skills, et cetera. I do think, just coming back to your original point, that it's a lot harder now than it was before to build software, and to build high quality software.
If you went back 20, 25 years ago, it was all waterfall. You started at the beginning, you had your big requirements, you delivered one thing and that was it. We then went to Agile. Now there are companies out there who are trying to deploy several times a day into their cloud environments. You look at the architecture, and you touched on this earlier as well: we've gone from monoliths to microservices and there are so many degrees of complexity. Was it Netflix who have something like a thousand microservices in their architecture or something?
I was just writing a blog earlier in the week, or late last week, on the evolution of test data management. And in that article, I was trying to talk through how test data has evolved over the years. We started off at the beginning, to your point Sanjay, in mainframes. Then we evolved to relational databases. Now we're in NoSQL and all of this good stuff out there. And now if you move past that, in the SaaS world, a lot of applications don't even have the concept of a backend database.
It's just all about going through the API, pulling data in and out. And it's like, if you're a tester, you've gone from this monolith, simple architecture, waterfall delivery to, you know, delivering really often, microservices, huge complexity of architecture. And there's just so much stuff out there. It's really hard these days to actually deliver high quality software. And I don't know any organization who has it nailed down and is a hundred percent successful, right? Like it's really, really hard to do, isn't it?
Sanjay (20:22.625)
Yes, because we are doing things very, very fast. I remember, back 10-12 years ago when I was at IBM, to post just one release used to take almost six months. And here we are posting releases every six minutes, like this. In fact, with SelectorsHub, millions of people are using it today, and we just don't take much time. If any user asks us, I need this feature, or this is not working, we just fix that, we just check
James Walker (20:27.266)
Yeah. Yeah.
James Walker (20:36.536)
Yeah.
Sanjay (20:52.321)
at a high level that everything is working fine, and just push it. We don't take that much time. So yes, definitely, 100% quality, of course... I mean, even back then it was not possible, even today it is not possible, and even in the future 100% testing... I mean, testing cannot ever be completed. So it's just keep on doing, keep on doing; people keep using, and we keep delivering in those ways, as we can, and
James Walker (20:55.298)
Yeah.
Rich Jordan (21:14.545)
I think you just articulated something really interesting, right, in terms of newer ways of working. And I think you articulated it there in terms of SelectorsHub, right; you fail forward, right, in that the expectation on quality is not that everything works perfectly first time. Actually, that whole feedback loop and that whole community thing is part of the quality cycle, right? In terms of, almost, quality in that guise is that they can use that
James Walker (21:14.861)
Yeah.
Rich Jordan (21:44.187)
product, they can give you fast feedback, and they will get a fast fix. You know, it's an interesting one. Was that intentional in the way that it would work, or is that by design? Did you almost fall into that way of working purely because that's the way modern development goes?
Sanjay (21:48.019)
Yes.
Sanjay (22:01.345)
Yes, you can say it was designed that way, or maybe I have a habit of it: whenever some feedback comes... Sometimes it happens that I don't post new releases frequently until there's some feedback. Sometimes I become a bit stagnant: okay, now what to develop, what not to develop? Nothing is there, everything is perfect, this is the saturation point. So that's where I start thinking.
Rich Jordan (22:09.103)
Hahaha!
Sanjay (22:30.635)
But suddenly, whenever negative feedback comes, whenever any user has faced any challenge, that's where again we post something, or we add a new feature, and that's how the releases go out. So yes, it happened that way also.
Rich Jordan (22:45.553)
I think there's an interesting dynamic change there in terms of testing, isn't there? In terms of, when are you finished? When it's good enough, right? Nothing needs to be gold-plated, silver-plated, whatever you want to call it, right? And I think there is another shift in terms of what testing has become, right? Historically, we tried to get 100% coverage. Caveat there: what does 100% coverage even mean, right? You execute
endless amounts of testing, endless amounts of regression, and something pops out at the end to say, yes, this will be perfect. No, that was never the case, right? And so we've evolved into, actually, what is good enough? And the interesting part there is, what does quality mean? What does good enough mean? Good enough means that we get to a certain point where, you know, our alpha testers, our beta testers, yeah, absolutely...
Sanjay (23:34.267)
the user can use the product and they can fulfill their requirement. Yeah. And whenever we feel that, okay, this is good enough for a user to fulfill their requirement, their use cases, and everything is fine, we should post the release. Because if we keep it with us, we might keep it with us forever. I will take the example of a couple of my friends here. So when I developed the very first extension... I used to work as a tester initially,
Rich Jordan (23:42.385)
Thanks.
Sanjay (24:04.193)
then I became a developer. So my friend was there, Raghunandan Gupta, and he always used to say: I have developed this extension; you just create yours and I will help you push it to production, to the Chrome Store. And I created the product and I was ready to push it to production, to the Chrome Store. I asked him, can you help me push this to production? He said, I never pushed mine, because I thought I was still testing my product. He's still testing his product, and I'm at 1 million downloads.
So it has happened many times that we keep testing, keep testing, if you have that thought in mind. So this is a very important thing for us as testers to keep in mind: we cannot complete the testing. We will always get a new use case. You just sleep with your product in mind and you will get some more ideas, more use cases. So whenever you feel that, okay, this product is now good enough,
James Walker (24:36.726)
Wow.
Sanjay (25:03.808)
just push it, and that way you will keep getting more feedback, and that's how you improve your productivity, I mean the quality of the product. And it's not just the quality; you also have to add more and more features to your product so it becomes more usable, solving more problems. If it is a 100% perfect product, it might not be useful for people, it might not be solving their problem. People will be happy to use a buggy product if that product is solving their problem. If your product is 100% perfect
Rich Jordan (25:05.361)
Yeah, really.
Rich Jordan (25:27.515)
Yeah.
Sanjay (25:33.035)
but it is not fulfilling the requirement, it doesn't make sense. So you should keep that in mind.
James Walker (25:36.914)
I think it really depends on risk, right? If you're in an environment where your appetite for risk is kind of high because the cost of something going wrong is low, right... Like you, with a million users, actually, if you push out bad software, the chance of someone having a bad experience is probably quite high.
Sanjay (25:42.708)
Yep.
James Walker (26:03.0)
Like, if you impact 10% of your users, they're dissatisfied, they're probably going to leave. So actually I think testing at that level becomes very important. But if you're at the beginning of, you know, you're in a startup, you're at the beginning of your journey, you know, absolutely, just push the software out, get feedback as fast as you can. And that's fantastic, right? The one that comes to mind at the moment is, I keep thinking about these guys on the International Space Station. Have you seen this? There were...
Sanjay (26:22.058)
Yes.
Rich Jordan (26:30.075)
They're stranded, aren't they!
James Walker (26:31.404)
Yeah, there were these two people who went up there, I think it's two people, astronauts, two astronauts, and they were going to be up there for like three or four days or something. And now they've been told they're up there pretty much indefinitely, because there were a few problems with the Boeing spacecraft that was meant to pick them up. And I think that just summarizes it. Think about how much testing has gone into these systems. And you just have to make the assumption that every system in the world has defects in it.
Sanjay (26:35.368)
Yes, yes.
Sanjay (26:54.503)
goes there.
James Walker (27:00.684)
There is no system in the world that is defect-free, right? So you just have to play it on risk and try to de-risk as much as possible, I guess, right?
Rich Jordan (27:09.265)
Do you think there are ways to mitigate that? And interestingly, obviously, one of our main tools is around modeling, software modeling. And therefore, a lot of the conversation we've just been having was around experimentation; you're experimenting with getting that end capability out as soon as you can. When we talk about modeling, a lot of the conversation is around spending too much time designing out
James Walker (27:13.677)
Yeah.
Rich Jordan (27:37.019)
capabilities, writing requirements, et cetera, et cetera. These are the customers we talk to, by the way. And so they're almost in a fixed waterfall mindset, by which you need to have this defined stage and you articulate your requirements to the nth degree, right? We know that's a fallacy. We know that the requirements cannot be known until you start to get further down the line. And therefore, actually, modeling becomes experimentation, right? How do you model out something so it's good enough?
Good enough to then go to the next stage of experimentation, be that build, be that testing out the model, whatever stage comes next. There are means and ways to mitigate those things. James, we talk a lot about the inner and outer loop. Do you want to jump in and talk about that a little bit, in terms of mitigating, or sorry, the neglect, let's say, of some of those mitigating factors that can be faced into?
James Walker (28:31.872)
Well, I think what we've been talking about a lot recently, Sanjay, is that for a lot of organizations we speak to, especially higher up the chain, it's all about developer productivity, right? That's like the highest KPI. And when you start really focusing on developer productivity, everything else becomes skewed. You know, you're just focused on churning out as many lines of code as possible. And actually what we found is the outer loop, which is definitely where you guys fit as well,
Sanjay (28:43.636)
Yeah.
James Walker (28:59.498)
It's really every single piece of the ecosystem beyond the code: the requirements, the automation, the testing, the data, the environments. And what we're finding is that organizations may focus so extensively on development KPIs that they're kind of missing the point, and their development productivity is so low because they don't have good data, they don't have good testing, they don't have good requirements, and it causes havoc everywhere else, right?
So what we're finding is that, to increase productivity, quality and everything we do in the testing space is so important, right? And that's kind of where you get to. So I think it's quite interesting. Rich, any thoughts on that?
Rich Jordan (29:43.269)
Oh, don't get me started! We'll cut this bit.
James Walker (29:49.526)
No, you're good.
Rich Jordan (29:53.233)
So I guess, in terms of, you know, a lot of the teams that we engage with, and I'm assuming inevitably come to you, Sanjay, have had failed automation attempts, right? And a lot of that is around, you know, very fragile, unstable foundations when it comes to what they are actually trying to test and the actual implementation of the automation. Do you get the kind of, you know,
Sanjay (30:17.308)
what what kind of thing yes
Rich Jordan (30:23.313)
typical conversation where an organization comes to you, Sanjay, and says: I've heard you've got this silver bullet in SelectorsHub, how do I solve all of these systemic automation problems that I've always had?
Sanjay (30:28.522)
Hahaha
Sanjay (30:37.408)
Yes, this pretty much happens with everyone, I guess: people wanted to do one thing and they end up doing something else. So that's a very natural thing. And then people come to you: you have this particular tool and we have all these problems, can they be solved? So, I mean, it's difficult to solve each and every problem with one particular tool, or any particular tool; each tool solves one particular problem.
Sanjay (31:04.513)
So that's what we tell people. But yes, you are very right; there are many people I have heard from and seen who come to me saying: okay, you have this awesome tool, SelectorsHub is there, and our entire automation is breaking, or we have a mobile app, we have a website, can your tool automate each and everything? I'm like, what? One tool cannot do everything. That means we should also set the expectations of
James Walker (31:28.418)
That's crazy, isn't it? Yeah.
Sanjay (31:34.176)
higher management, or whoever is thinking about the automation: 100% cannot be done with one tool. You will have to implement lots of things, process and technology, to automate. In fact, you will also have to cover some of the scenarios you want in manual ways, because the end user's use of your product is not going to happen in the same way
that your automation exercises or tests your particular product. So you should also consider covering those scenarios in manual ways, like a human would use it. So yes, basically we should cover all the particular scenarios in the same way the end user is going to use them. Don't unnecessarily automate each and every thing, don't try to automate every single thing, and don't think that 100% automation can be achieved,
or that 100% testing can be achieved, in any way.
Rich Jordan (32:35.213)
I think that's interesting in terms of being realistic about what can be done. Like you say, you need to have a good approach, and that might be a collection of test types and tools, recognizing that some of it probably is non-deterministic and therefore needs to stay manual. Do you think this expectation of automation, that you can automate everything,
Sanjay (32:41.706)
Yes.
James Walker (32:42.017)
Yeah.
Rich Jordan (33:03.789)
that AI will replace it all... Do you think that whole bandwagon has increased over time, or do you think it's diminished? Why don't we learn from our mistakes?
Sanjay (33:12.545)
I mean, I learn a lot; I don't know about others. That's partly how SelectorsHub is there. So yes, that's very true, not everyone works that way. Sorry, I just missed the question. Can you repeat it, please?
James Walker (33:15.704)
That's why he has a million downloads.
Rich Jordan (33:16.145)
That's the grand view of the winning subject.
Rich Jordan (33:37.905)
Why don't you think we learn from our mistakes? Do you think that's a tester problem, or do you think that's outside influences?
Sanjay (33:44.768)
That's a human problem. I mean, it's not just testers or developers; this saying is very, very general: why don't we learn from our errors? So we cannot just say that testers are not learning from their errors. It could be anyone: developers, testers, business, anyone. But yes, what we hear in our industry is that
these kinds of things come up a lot for testers: we should be doing 100% automation, every tester should learn coding, every tester should do coding, we don't want armies of testers. You must have seen that many companies are completely laying off their testing teams, or they are converting every tester into a developer. So we should come out of those kinds of thoughts. We should be more realistic. If you want to make your company successful, or you want to achieve something, just think about the real thing.
Don't just bluff with your own company. If you're working with a company, or with someone, be realistic; don't just ask your team to automate 100%, do 100% this way or that way. Nothing can be achieved in that format; 100% cannot be achieved. So whether this is testing or development... like, if I say that, okay,
James Walker (35:01.036)
Mm
Sanjay (35:02.035)
now ChatGPT is there, I can get my entire code from there, I don't need Stack Overflow or DevTools or whatever. No, it can't happen that way. Yes, ChatGPT is there, and we should definitely utilize it. Whenever any new tool or technology comes, I see it as an automation tool; they have automated one kind of process. So they have automated one kind of process: I was writing emails earlier, I still write them today, but I copy-paste them in there and ask it to make them better;
it will correct the grammar, it will improve it. Today also, I write a YouTube title, I put it in there, make it better. Similarly, earlier I used to look for code, look for a solution on Stack Overflow where people were writing answers. Now ChatGPT has automated that process of Stack Overflow, where people were writing the solution to a problem. So I'm utilizing it that way, but I don't think it will
James Walker (35:33.304)
Mm -hmm. Mm -hmm.
Sanjay (35:58.869)
just replace my job, that it will start developing the tools, that it will start creating the ideas. Okay, one day ChatGPT will come along, think automatically that selectors are required, XPath is required, and it will automatically develop a SelectorsHub extension and publish it in the Chrome Store? It is never going to happen. Because, I mean, think of it in a simple way: we have developed ChatGPT, ChatGPT has not developed us. So this is another tool.
Yes,
So people are talking a lot about it. They might not be implementing it in their own company, they still might be using the same old ways, but they will talk; they will be talking at the conferences, in many, many ways, for whatever reason. And if I counter this: you see, in SelectorsHub, initially we tried to implement AI, but we found this is not the right use case to implement in our solutions. And today you see we have 1 million downloads, or users, you can say testers, using it, and
Rich Jordan (37:17.006)
Right.
James Walker (37:17.772)
Mm -hmm.
Sanjay (37:23.456)
we are still... last month we launched one product, Check My Links, and today, just within a month, it has got 10,000 active users; testers are using it. There's no AI, nothing is there, it's a simple plugin. So there are many problems which we can find for the testers, which can improve their productivity. We can improve their productivity even without any AI, without any rocket science; with very simple things we can do that, we can solve problems for many millions.
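As an illustration of how far a "simple plugin, no AI" idea can go, here is a rough sketch of a link checker in the spirit Sanjay describes. It is an assumption-laden example, not the actual Check My Links code, and cross-origin servers may refuse the probes unless an extension requests the right host permissions:

```javascript
// Probe every absolute link on the current page and report the ones that fail.
// Hypothetical sketch only; a real extension would batch requests and run them
// from a background script with host permissions to avoid CORS rejections.
async function checkLinks() {
  const links = [...document.querySelectorAll('a[href^="http"]')];
  const results = [];
  for (const link of links) {
    try {
      const response = await fetch(link.href, { method: "HEAD" });
      results.push({ href: link.href, ok: response.ok, status: response.status });
    } catch (err) {
      // Network failure or CORS block: flag it for a manual look.
      results.push({ href: link.href, ok: false, error: err.message });
    }
  }
  return results;
}

checkLinks().then((results) => {
  const broken = results.filter((r) => !r.ok);
  console.log(`${results.length} links checked, ${broken.length} need attention`, broken);
});
```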
James Walker (37:48.578)
Mm -hmm.
Rich Jordan (37:50.669)
I think it's really interesting, and really healthy, that you're saying these things, right? Because you're quite right that not all problems are best fits for AI, right? And so there are numerous other ways to better solve these problems, and it's healthy, because all too often you see vendors come out with AI everything, right? And are they actually solving a problem? Or actually is it just a...
Sanjay (38:05.374)
Yes, yes, of course.
Absolutely, absolutely.
Sanjay (38:16.81)
Yes, yes,
Rich Jordan (38:20.054)
It's a marketing gimmick a lot of the time,
Sanjay (38:22.877)
Many times we discuss this with many people, and when we talk in the community as well, with the leaders, many people agree: yes, this is a marketing gimmick. Sometimes we talk about AI, AI in our product, just to sell it, or maybe just to raise funds and all. But actual problems can be solved even without AI, and even in a better way. There are problems, there are very, very good problems in the testing and development space
Rich Jordan (38:39.889)
Yeah.
Rich Jordan (38:45.413)
Yeah, absolutely.
Sanjay (38:52.83)
which should be solved, which can be solved; it's just that there should be a person to find out the problems in the right way. You can solve them, you can earn, you can make life better for people. And in the same way, like we are doing at SelectorsHub, every single day we try to find out the local problems, or the generic problems, which testers are doing manually, or in whatever way, like developers, or
whoever is doing it. We try to find out those problems and we create a very simple solution for them. Very simple; we don't use any rocket science, any AI or anything, just simple JavaScript, two or three files, half a page of code, and that simply solves the problem. So anybody can do that.
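To make the "two or three files, half a page of code" point concrete, here is a hypothetical content script of roughly that size; paired with a short manifest.json, it flags images that are missing alt text, the kind of small everyday check a tester might otherwise do by hand. It is a generic sketch, not one of Sanjay's extensions:

```javascript
// content.js - highlight images with no alt text and log a summary.
// A standalone sketch; shipping it as an extension only needs a manifest.json
// that registers this file as a content script.
function flagImagesMissingAlt() {
  const images = [...document.querySelectorAll("img")];
  const missing = images.filter((img) => !img.getAttribute("alt"));
  missing.forEach((img) => {
    img.style.outline = "3px solid red"; // make the offenders easy to spot
  });
  console.log(`${missing.length} of ${images.length} images have no alt text`, missing);
  return missing;
}

flagImagesMissingAlt();
```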
Rich Jordan (39:27.443)
it.
Rich Jordan (39:36.625)
Absolutely. James, your thoughts?
James Walker (39:37.164)
Yeah, no, I agree with that. I mean, to me... I'm going to caveat this to begin with: I love AI, I think it's fantastic, and I use it every day as well, Sanjay. I'm like you: every single email, let's put it through ChatGPT and see. But also, you know, if an algorithm exists, I would not use AI, like...
Sanjay (39:46.246)
Hahaha
James Walker (40:00.248)
Why would you do that? AI is non-deterministic by nature, right? You don't get the same result every time. And for 99.9% of use cases, that is not good enough. You want to have determinism. That is incredibly useful when you're creating software and you're trying to plug in an algorithm where you want to get deterministic results back out, right? Where I see AI being useful is actually when you're trying to take something that's inherently unstructured and turn it into something structured.
So we've had good use cases where it's like: you take a recording transcript of a meeting, which is very random, it's all over the place, it's very unstructured textual data, right? You plug that into an LLM and you say, give me back a model or a flowchart which covers all the different business processes and everything that was discussed in that meeting, right? And let's try and really nail down the requirements into a model that we can then use for testing, right? And that's fantastic, because that's not possible using an algorithm.
But I agree, I see a lot of companies who are just: let's plug AI in, let's use AI to do this, let's use AI to do that. And you're like, no, there are algorithms that have existed for the past 30 years that can do this procedure. Why are we using AI for it? Just use the algorithm. And I really don't understand that. I think you're right, it's a lot of buzz, and that's kind of the nature of it. And also, you know what developers are like: we love experimenting, we love using new technology. And sometimes it's just, how can I apply this? Even if it's the wrong...
Sanjay (41:04.862)
Hahaha
Sanjay (41:12.074)
Thank you.
Sanjay (41:26.804)
Yep.
James Walker (41:27.22)
know, technology to apply to it, right?
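The unstructured-to-structured use case James describes above can be sketched in a few lines. The snippet below assumes an OpenAI-style chat completions endpoint, a hypothetical OPENAI_API_KEY environment variable, and a model name chosen purely for illustration; it is not a description of any specific product's implementation:

```javascript
// Turn a raw meeting transcript into a Mermaid flowchart draft for review.
async function transcriptToFlowchart(transcript) {
  const response = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`, // assumed to be set
    },
    body: JSON.stringify({
      model: "gpt-4o-mini", // illustrative model choice
      messages: [
        {
          role: "system",
          content:
            "Extract the business processes discussed in this transcript and return them as one Mermaid flowchart. Return only the Mermaid code.",
        },
        { role: "user", content: transcript },
      ],
    }),
  });
  const data = await response.json();
  // The output is a starting point: a human still reviews it before it becomes a test model.
  return data.choices[0].message.content;
}
```

The point is the division of labor: the LLM handles the messy unstructured-to-structured step, while deterministic tooling takes over once a reviewable model exists.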
Rich Jordan (41:30.385)
It's interesting, this, because a lot of what you were talking about, what you both were talking about, is tools and utilities for everyday productivity for developers and testers, right? But yet the message gets conflated into AI will replace everything, right? And it's like, wait a minute, there's a big difference, right? And, you know, it's the same conversation from earlier on, right? The practitioners know the challenges, they know where the solutions can solve the challenges, but yet there
Sanjay (41:30.568)
Absolutely.
Rich Jordan (41:59.115)
seems to be this, I suppose, marketing-stroke-who-holds-the-budget kind of problem in organizations, right?
Obviously you've got a hell of a lot of capabilities, Sanjay. I guess the interesting question from my point of view is: what are the current challenges that you are looking at solving next? I'll put you on the spot a little bit there. You can't have solved all of them.
Sanjay (42:23.584)
Yes, yes, of course I can't solve all of them, but if there is any challenge, I don't sleep on it; I immediately start building the solution for it. There are many people who ask me, what is the next tool you are developing, what is the next tool you are launching? So I always say that if there is a problem, I might be launching something today itself.
Rich Jordan (42:29.787)
Yeah, anyway.
James Walker (42:40.875)
I love it.
Sanjay (42:51.722)
So as of now I am not running with any challenges, so pretty much everything is solved. I already launched a couple of tools last week; maybe in another month I will launch a few more. So whenever there is a problem, I immediately start working; I can't sleep if there is a challenge in my mind, okay, this problem should be solved. So immediately... like, I recently
Rich Jordan (42:54.715)
All right.
Sanjay (43:16.309)
developed and published this tool, Page Load Time, and people are using it. And then there was a problem: one of our users asked me, can it be like this, this, this way? And immediately, just before this call, I was developing that thing; I was just implementing it and will be pushing it. So it's a good problem which we have solved. So, like you very well said, what is the next challenge you are solving: there is always a challenge. You just keep talking to your users,
keep talking to the community, and you will find out the problems. You just ask them: okay, you did the testing today, what were the challenges, where did you get stuck? And, I mean, maybe not immediately will you find a problem to create a solution for. But eventually, once you become habituated to finding the problems in their talk, in their conversation, then you will become a creator and you will start creating and developing solutions and tools, et cetera.
James Walker (43:52.736)
Yeah.
Sanjay (44:14.516)
like that and that's how it happened with me.
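For the curious, measuring page load time in a content script really can fit in half a page. The following sketch uses the browser's standard Navigation Timing API and is only an illustration of the idea, not the Page Load Time extension's source:

```javascript
// Report how long the current page took to load, using Navigation Timing.
function reportPageLoadTime() {
  const [nav] = performance.getEntriesByType("navigation");
  if (!nav) return; // very old browsers may not expose this entry type
  console.log({
    page: location.href,
    domContentLoadedMs: Math.round(nav.domContentLoadedEventEnd),
    fullLoadMs: Math.round(nav.loadEventEnd), // milliseconds from navigation start to load event end
  });
}

// loadEventEnd is only populated after the load event finishes, so defer the readout.
if (document.readyState === "complete") {
  reportPageLoadTime();
} else {
  window.addEventListener("load", () => setTimeout(reportPageLoadTime, 0));
}
```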
Rich Jordan (44:14.737)
Do you find that they articulate the problems, or do they tend to talk about symptoms of a problem and you then need to go and get to the root cause?
Sanjay (44:27.979)
Both. Sometimes they directly say: okay, Sanjay, this particular tool, your tool, is not solving this problem. So that's direct: okay, this is a feature request. Most of the time it doesn't come that way; you have to feel the problems yourself. Like, if I talk about Testing Daily: I was out on a walk after lunch, and I was just thinking that I had missed something.
I had missed a couple of conferences, and I immediately called one of my friends: hey, did you get to know about this particular conference? These conferences recently passed. He said no, how do you get to know about them? I use LinkedIn, this platform, that platform, I follow these people. So from there I got the idea that there should be a platform, because there are millions of blogs written every day on Medium, LinkedIn...
Rich Jordan (45:21.553)
Yeah.
Sanjay (45:24.138)
Curiosity, LambdaTest, Applitools, and many other companies are out there; everybody is creating content for testers. Now, as a tester, you cannot go and follow each and every company's website, you cannot follow each and every influencer. So that's where I got the idea. And before developing it, I talked to many people: what do you do, how do you keep yourself updated with software testing, what do you do? I follow this influencer, I follow you, I follow him, etc.
James Walker (45:26.401)
Yeah.
Rich Jordan (45:50.619)
Yeah.
Sanjay (45:52.084)
So this is where, this is how we sometimes get the idea for a tool. Sometimes you keep reading posts on LinkedIn, what people are talking about, what problems they are facing. Sometimes you might look at a comparison of a couple of tools and then you see that, okay, all these tools are not solving this particular problem; let's solve it. Like that, it happened. So you just get a creator mindset, when you start and when you mature out of those things, and when you love it and when you feel the fun of it.
James Walker (46:02.794)
Mm -hmm.
Sanjay (46:22.002)
So yes.
Rich Jordan (46:23.473)
I mean, it's like yoga of peace, but I'm intrigued to understand: what's the duration from idea to product? How long does it actually take you, typically?
Sanjay (46:33.952)
I don't take much time. Typically not even one week, I would say. So I immediately start developing the product and just try to push it ASAP. Because I believe, if there is a solution for a small problem... if my solution is solving five people's problem, and my network is a hundred-people network,
Rich Jordan (46:38.737)
Really?
James Walker (46:40.908)
That's amazing.
Sanjay (47:01.696)
out of those 100 I am solving 5 people's problem. Imagine in the entire world how many people there are. So this 100 becomes like 100 million, 1 billion; this 5 will become 5 million. So basically I am going to solve this problem for 5 million people, on average, I am thinking. So that's how I think. I immediately post. Maybe it could be buggy, maybe it might not be solving 100% of the problem,
Rich Jordan (47:07.899)
Yeah, absolutely.
Sanjay (47:29.312)
but it will solve those 5 people's problem. Those 5 people will tell me: okay, this, this, this should be done. So that's how. If you look at any tool which I have launched from day one and compare it now: I launched it at 1%, and now every single tool is at 100%, at that level. So that's how.
Rich Jordan (47:50.267)
That's the way to do it though, isn't it? It's interesting. How do you decide? I'm assuming you probably get lots of engagement around things that people want. How do you know whether it's going to be useful or not? I guess it becomes a volume thing, right?
James Walker (47:51.864)
Yeah, it's amazing.
Sanjay (47:52.149)
Yeah.
Sanjay (48:08.352)
Yes, so the same answer as the last one. If I feel that 5 people, 4 people, even 1 person is having this problem, and this solution is going to solve the problem for those 5 people, or that 1 person, I just push it. I will create the solution for that and I will push it. I am 100% sure that if this one person has this problem, it means millions of people are having this problem; it's just that we don't know who those people are.
So through marketing, or through communication, or through the community, or whatever ways, we have to find those people: I have created the solution, take it, go enjoy. Like the XPath and selectors problem and SelectorsHub. Even today you will see people talking: don't use an XPath tool, don't use selectors; they are still writing them manually. Okay, fine, but I found those millions of people, I converted them, and they are using it happily.
Rich Jordan (49:05.051)
That's awesome. Conscious of time, Sanjay, thank you very much for your time today. If people want to get hold of you, or want to understand more about SelectorsHub or all of the other various capabilities that you've got, how do they contact you or find out more about you?
Sanjay (49:21.862)
One place: SelectorsHub.com, and they will get everything.
Rich Jordan (49:25.293)
Awesome. If you could give one piece of advice to the testing community struggling with test automation, what would it be? Use SelectorsHub?
Sanjay (49:36.288)
Use SelectorsHub, just like that! So, I mean, yes, that's absolutely correct. But I can hardly say no.
James Walker (49:41.644)
Hehehehe
Rich Jordan (49:43.178)
And as I say, if you've got a feature enhancement, Sanjay will turn it around in a week.
Sanjay (49:48.89)
Hahahaha, yeah, you just let me know. Yeah, definitely, you just let me know that this is a problem, or you just ping me anywhere, and I might come up with a tool. True. I mean, the only point, the one suggestion, is that if you find a problem, just create a solution for it; there are solutions. Don't think, I don't know this technology, how will I create it? If you are an engineer, you should be a good Googler, or now ChatGPT is there, so...
James Walker (49:49.253)
Just a whole new product, even beyond here.
Rich Jordan (50:01.809)
That's awesome.
James Walker (50:02.626)
That's amazing.
Sanjay (50:17.98)
Your prompt can solve your problem. However good a prompt you put in, however good a keyword you put out there on Google or ChatGPT, you can get the solution. I'm not a good developer. I'm a chemical engineer.
Rich Jordan (50:30.019)
Awesome. Well, it sounds like you're a good developer. Awesome. Once again, a massive thank you to Sanjay. Really interesting insight, not just on testing but on product development, let's be honest. Thank you very much. Thanks to James for co-hosting with me. Please like and subscribe to the podcast. Let's grow the community bigger than Sanjay's one million; one day we'll get there. Once again, thank you Sanjay. Big, big thank you.
Sanjay (50:32.33)
No worries.
James Walker (50:38.402)
Thank you, Sanjay.
Sanjay (50:43.136)
Thank you so much guys for having me here
Sanjay (50:55.232)
Thank you. Definitely.
James Walker (50:55.458)
Thanks guys
Sanjay (51:00.074)
Thank you so much Rich and James for having me here. Bye.
James Walker (51:00.216)
Thank you, Sanjay.
Rich Jordan (51:01.105)
Bye.
00:00 Introduction and Congratulating SelectorsHub
02:52 Understanding Community Problems and Building Solutions
06:00 Challenges in the Testing Industry
08:53 Shifting Focus from Process to Solutions
12:30 How has Testing Evolved over the Past 10 Years
20:15 Balancing Risk and Quality in Software Development
25:50 Modelling and Reaching a High Quality Standard
28:13 Failed Automation Attempts at Large Organisations
31:00 Managing Expectations and Challenges in Test Automation
35:30 The Role of AI in Test Automation: Context is Key
39:26 Solving Real Problems in Testing and Development
46:00 Empowering the Testing Community to Create Solutions
- Sanjay Kumar has developed several tools for the testing community to solve testers' daily problems.
- The testing industry faces challenges such as the disconnect between managers and practitioners, resistance to adopting new technologies, and a focus on process rather than solutions.
- Community feedback and continuous improvement are crucial for developing high-quality software.
- There is a need to balance risk and quality in software development and embrace new technologies to improve productivity and solve testers' problems. Have a realistic approach to test automation and recognize that it cannot solve all problems.
- AI should be used in the right context and not as a replacement for all algorithms and tools.
- Identify and solve real problems faced by testers and developers, even without AI.
- Quickly develop and launch solutions based on user feedback.
- Encourage the testing community to create solutions for the problems they encounter.
Inside the outer loop
November 4th, 2024
48 minutes
In this episode, Rich Jordan and James Walker are joined by special guest Sanjay Kumar, Founder & Creator of SelectorsHub.
Sanjay and Rich start the conversation by exploring the challenges in the testing industry, such as the disconnect between managers and practitioners, the resistance to adopting new technologies, and the need for a shift in focus from process to solutions. Sanjay highlights the importance of community feedback, continuous improvement, and the balance between risk and quality in software development.
Sanjay Kumar shifts the conversation towards the challenges and expectations of test automation. James and Rich highlight the importance of having a realistic approach to automation and not expecting it to solve all problems.
The team also touched on the misconception that AI can do everything: it should be used in the right context, not as a replacement for every algorithm and tool.
Sanjay emphasized the importance of identifying and solving real problems faced by testers and developers, even without AI. He shared his approach of quickly developing and launching solutions based on user feedback. Sanjay ended the episode by encouraging the testing community to create solutions for the problems they encounter!
This episode is packed with valuable takeaways you can apply to your work and leadership journey. Listen or watch it now!
Watch the full episode to learn more!
Join Rich Jordan, James Walker and special guest Sanjay Kumar, the Founder and Creator of SelectorsHub, as they discuss Sanjay's journey developing tools for the testing community and his mission to solve quality problems.