XP Series Webinar

End-to-End Test Automation with Provar

In this XP Series webinar, you'll learn from Zac about the intricacies of 'End-to-End Test Automation with Provar'. Explore insights into efficient testing strategies for robust software solutions.

Watch Now

Zac Taylor

Senior Solutions Engineer, Provar

Zac Taylor is a Senior Solutions Engineer at Provar with nearly 7 years of experience working with software development and quality teams. He takes pride in helping small and enterprise organizations deliver quality software solutions. Zac holds dual degrees in Data Sciences and Economics with a minor in Mathematics. He resides in Atlanta, Georgia, and enjoys video games and hiking when he isn’t doing home improvement projects.

Harshit Paul

Director of Product Marketing, LambdaTest

Harshit Paul serves as the Director of Product Marketing at LambdaTest, where he plays a pivotal role in shaping and communicating the value proposition of LambdaTest's innovative testing solutions. His leadership in product marketing ensures that LambdaTest remains at the forefront of the ever-evolving landscape of software testing, providing solutions that streamline and elevate the testing experience for the global tech community.

The full transcript

Harshit Paul (LambdaTest) - Hello, everyone, and welcome to another exciting episode of the LambdaTest XP Series. Through XP Series, we dive into the world of insights & innovation featuring renowned industry experts and business leaders in the testing and QA ecosystem. I'm Harshit Paul, your host and the Director of Product Marketing at LambdaTest, and I'm thrilled to welcome you to today's XP webinar session on End-to-End Test Automation with Provar.

And with me, we have Zac from Provar. Zac brings with him almost seven years of experience, not just working in software development and quality teams but being a solution provider as well. He takes immense pride in assisting organizations, both big and small, to deliver top-notch software solutions. He's not just a tech enthusiast; he's a well-rounded individual with a passion for numbers and innovation. Zac, how about you introduce yourself to the audience as well? It's a pleasure to have you, of course.

Zac Taylor (Provar) - Yeah, wonderful. Thank you so much, Harshit, for the introduction. As Harshit mentioned, my name is Zac Taylor. I do have around seven years of experience in the industry. I've been in QA for quite a while. I've been in the Salesforce ecosystem for nearly five years now. I started out in a technical capacity and found myself in the solution engineering side, being able to deliver quality solutions to people and have those conversations while also being able to have that technical expertise. So again, thank you very much.

Harshit Paul (LambdaTest) - Thanks for being here, Zac. And with that said, let's set the stage for what lies ahead. We are going to deep dive into performing test automation with the help of Provar. And Zac is going to be the one sharing the stage from here on. So, Zac, you can probably go ahead from here on. Thank you.

Zac Taylor (Provar) - All right, thank you so much. I'll get my screen shared here, and we can go ahead and jump right into it. All right. Are the slides visible?

Harshit Paul (LambdaTest) - Yes, I can see it, that's clear.

Zac Taylor (Provar) - Wonderful. So I'm just gonna go ahead and jump right into it and give a little background about who Provar is, what we've been doing in the industry, and how we got to where we are today. So who is Provar? We've been the leader in Salesforce testing since 2014. We've been around for nearly a decade, we've been thought leaders in the industry, and we're pretty well known for being a Salesforce-first testing solution.

However, we are able to reliably test full end-to-end scenarios that involve various web applications, APIs, databases, email services, and more. We do have a global presence with offices in the US, UK, and India, and we have over 250 customers worldwide.

I did want to start this off by briefly explaining some of the differences and nuances when it comes to testing Salesforce as compared to testing generic solutions. And then to kind of further expound on that, what we've noticed, the challenges we faced, how we've overcome them within the Salesforce environments, and then how we further expanded that to some more broad automation testing. To get started, some of you may or may not know, but Salesforce does do releases roughly three times per year.

These can consist of multiple UI changes. In some instances, they can be rather wide-sweeping, and you may not get a lot of information on when they're actually going to hit your sandboxes. This can be quite frustrating to quality assurance teams in particular because I like to dub it as effectively burning a candle at both ends. Not only do you have to deal with the development work that your development team is working on, but you also have to, again, deal with the changes that Salesforce is pushing out.

To help combat that, Provar has introduced a metadata-driven approach to testing Salesforce. We also have a very close relationship with Salesforce and their roadmap. So anytime Salesforce comes out with releases, we immediately come out with releases as well that minimally impact or negate those changes, allowing your users in QA to effectively test your development work rather than Salesforce's.

Additionally, Salesforce has a very complicated and deep DOM. It has dynamically generated IDs that change every time the page is re-rendered, and a lot of different page layouts based on the profiles and permission sets you're testing against. This has, again, caused quite a challenge, especially for generic testing solutions. So again, we leverage that metadata-driven approach to testing and enhance it further to ensure you have a declarative approach to testing Salesforce and handling this complicated DOM.

Salesforce also introduced its own custom implementation of the Shadow DOM. Shadow root and shadow DOM elements have been quite difficult to penetrate. To help facilitate testing in this environment, we created a proprietary locator technology called NitroX that allows us to penetrate those shadow root and shadow DOM elements with standard XPath notation. Additionally, there are a lot of brittle and nested frames within Salesforce, which makes them very difficult to traverse. If any of you have done test automation in the past, I'm sure you're aware of just how painful iframes are: navigating, switching into and out of them to ensure your elements are being interacted with correctly. Provar, again, has a declarative approach to building test cases. We leverage our Test Builder solution to declaratively build out tests and auto-detect things like iframes and shadow roots, ensuring that our end users have a very seamless process for building tests rather than getting bogged down with all the nuances.
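
To illustrate the pain point NitroX abstracts away, here is what plain Selenium 4 requires for a shadow DOM element: each shadow root must be entered explicitly, and XPath does not work inside it, only CSS selectors. This is a minimal generic sketch, not Provar's implementation; the URL and element names are hypothetical.

```java
import org.openqa.selenium.By;
import org.openqa.selenium.SearchContext;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.chrome.ChromeDriver;

public class ShadowDomExample {
    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();
        driver.get("https://example.lightning.force.com"); // hypothetical org URL

        // A standard XPath query stops at the shadow boundary, so something
        // like this typically fails for elements inside a web component:
        // driver.findElement(By.xpath("//lightning-input//input"));

        // Plain Selenium 4 instead requires explicitly entering each shadow
        // root, and only CSS selectors work within it:
        WebElement host = driver.findElement(By.cssSelector("lightning-input"));
        SearchContext shadowRoot = host.getShadowRoot();
        WebElement input = shadowRoot.findElement(By.cssSelector("input"));
        input.sendKeys("Acme Corp");

        driver.quit();
    }
}
```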

Salesforce also has a very nuanced environment when it comes to environment strategies. They offer a wide range of different testing platforms, from scratch orgs that are spun up, developed on, tested against, and then broken down, to things like sandboxes, production environments, developer environments, etc. To ensure we don't have to duplicate our tests across different environments, we came up with a very nice way to override environments within our particular test cases. This allows us to use the same test cases, parameterize the connection, and point it toward different orgs and environments. This way, we don't have to maintain mirrored test cases for every single node we're testing against.
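
As a rough sketch of the idea behind connection overrides, consider keeping one test suite and resolving the target org from a single switch. This is illustrative Java, not Provar's actual configuration mechanism; the property name and URLs are invented:

```java
import java.util.Map;

public class ConnectionOverride {
    // Hypothetical per-environment endpoints; a real project would load these
    // from configuration rather than hard-coding them.
    private static final Map<String, String> ORG_URLS = Map.of(
            "dev", "https://dev-org.sandbox.my.salesforce.com",
            "uat", "https://uat-org.sandbox.my.salesforce.com",
            "prod", "https://my-org.my.salesforce.com");

    public static String resolveOrgUrl() {
        // One switch (e.g. -Dtarget.env=uat) repoints every test case,
        // so no mirrored copies of the suite are needed per environment.
        String env = System.getProperty("target.env", "dev");
        return ORG_URLS.getOrDefault(env, ORG_URLS.get("dev"));
    }
}
```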

Right. So all of these issues, nuances, and challenges lead to what we like to call the rework spiral. As an example, you have a suite of automated test cases. They've been running fine, and then something causes them to break. So the process to fix that is looking at the logs, maybe executing the test case again to make sure the logs are correct, verifying what broke, fixing it, rerunning the test case, and then rinsing and repeating that process until you successfully get a passing run.

While this works, it isn't necessarily effective. There's a lot of time, effort, and maintenance involved in continually re-executing these test cases. And as you scale, more coverage is, of course, going to mean more rework on top of that. So again, what we've noticed is that eventually, some of our clients reach what I like to call critical mass within their testing, where they come to realize they're spending more time refactoring and editing their test cases than they are getting value and results out.

And at that point, you kind of have to ask yourself, what is the point of automation if you're spending more time fixing it than running it? Some customers have reported over an 88% reduction in test maintenance time within a few months of adopting us for Salesforce testing. So again, there's a lot of power in breaking that rework spiral: leveraging our Test Builder solution as a declarative approach to building test cases allows us to pause, stop, rewind, and edit on the fly in those sessions, but it also gives us real-time feedback that the locators and actions we're using are gonna pass. I'll show this a bit in the demonstration later, but being able to step backward, step forward, edit test cases, and re-execute has saved our clients a significant amount of time.

Harshit Paul (LambdaTest) - Yeah, this looks pretty interesting, and I'm pretty sure this might be something you get asked pretty often: how can you extend these capabilities to custom applications? How can Provar be used with unique tech stacks or custom applications?

Zac Taylor (Provar) - Yeah, so it's a great question. So we have a number of different locator methodologies based on the objects and elements we're interacting with. I've mentioned Salesforce metadata a couple of times in this. Within Salesforce, again, we leverage that metadata API to give us robust test cases. Outside of Salesforce, or for highly customized components where we don't necessarily have metadata information, we've been utilizing the page object model that a number of other solutions have had.

And I'll actually just jump to my next slide here to talk about that Provar difference. So again, we've covered the main portion of that, which is metadata-driven testing. Then we have the page object models; these have been working just fine for a while, again using that Selenium page object model approach to build out test cases that are reliant on the DOM.

Again, we've leveraged our NitroX locator technology to help us penetrate shadow root and shadow DOM elements. But we have further expanded that in the past year to develop a component-driven framework that's much more robust, reusable, and resilient than the standard page object model. The benefit of being component-driven is that it's much more granular than page objects.

So again, if you have the same component appearing on multiple different pages, you can map that component on a single page, and Provar's Test Builder will be able to detect it elsewhere, so you don't have to waste time remapping the same steps you already did.
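
To make the contrast concrete, here is an illustrative Selenium-style sketch of a component object that two pages share, instead of each page owning duplicate locators as in a classic page object model. The class names and selectors are invented; this is not Provar's internal model:

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;

// Component object: mapped once, reused on every page where the widget
// appears. This is the gist of component-driven mapping.
class PropertyFilterComponent {
    private final WebDriver driver;
    PropertyFilterComponent(WebDriver driver) { this.driver = driver; }

    void filterByCity(String city) {
        driver.findElement(By.cssSelector("c-property-filter input.city"))
              .sendKeys(city);
    }
}

// Two different pages compose the same component instead of each page
// duplicating its locators, as a classic page object model would.
class PropertyExplorerPage {
    final PropertyFilterComponent filter;
    PropertyExplorerPage(WebDriver driver) { this.filter = new PropertyFilterComponent(driver); }
}

class PropertySearchPage {
    final PropertyFilterComponent filter;
    PropertySearchPage(WebDriver driver) { this.filter = new PropertyFilterComponent(driver); }
}
```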

Harshit Paul (LambdaTest) - That makes sense. I see that you've also mentioned Selenium with XPath, and that brings up a very commonly asked question: how is Provar able to enhance the field locator strategy? How does it enable the framework to extend beyond Salesforce testing?

Zac Taylor (Provar) - Yeah, absolutely. So again, our baseline was that page object model, and that's effectively the gold standard; it's what a lot of other frameworks have been using. But to extend that, we're really leaning into our NitroX framework that is component-driven. So again, being able to identify specific components that aren't tied to particular pages gives you a lot more versatility when testing, a lot less time spent mapping elements that you've already mapped on previous pages, and allows you to, again, extend that framework significantly.

Additionally, you do have the ability to, you know, variablize and parameterize different inputs. Really, the world is your oyster, so to speak, as far as what we can do to extend NitroX in that regard.

Harshit Paul (LambdaTest) - Interesting.

Zac Taylor (Provar) - Yeah, absolutely. Excellent questions, though. I am going to jump to the next slide here. And this is really just a feature slide. We've talked a lot about UI testing, how we handle the DOM, and some of the changes that we run into. But some of the other things that have come up quite a bit in the industry are MFA and SSO support. So again, we do provide the ability to connect to your applications via SSO, and we also have the ability to handle multi-factor authentication, including if your organization needs to do both of those simultaneously for your test environments.

I've touched on environment switching as well. Again, while this initially started out as being specific to Salesforce, it lets you toggle across different production environments and reuse those same test cases, so you don't have to reinvent the wheel or, again, maintain those test cases repeatedly. Provar also offers what we call auto-navigation for apps and tabs. So again, this is an easy way to automatically navigate to a particular URL, or if you have the ID of an object in Salesforce, it automatically navigates to that detail page.

So again, this is a really easy way to save test steps in your tests, ensuring you don't have to, say, search for an item in a list and then click it. We can just immediately navigate there to test the meat and potatoes of what you really want to test. Integration testing is another big one. While we are a Salesforce-first testing solution, we do understand that there are a lot of moving pieces outside of Salesforce that we also need to validate, right?

Maybe we have an external CRM that's feeding data into our Salesforce org, and we need to validate that at every stage of the process. Maybe there are some form submissions that come in that generate leads, or maybe there are some external API calls that have been iffy in our code. We provide the ability to test the full breadth of integration testing. If it has an open API endpoint, we'll be able to connect to it, and anything that renders in a modern web browser, we'll be able to build a UI test case against.

Finally, I just want to touch on email testing as well. Email testing has come up quite a bit, especially for things like marketing campaigns: ensuring that those emails are hitting the appropriate recipients with the appropriate subjects. So again, we can test that either via the UI or the API.

Harshit Paul (LambdaTest) - Yeah, all things testing makes you wonder how you'd make it a part of your CI/CD pipelines, right? So how do you ensure that Provar is preferred by every DevOps engineer? How do you integrate it into CI/CD pipelines and work with version control? How does that work?

Zac Taylor (Provar) - Yeah, absolutely. So we actually have a direct integration with Git from our desktop client. So you can push, pull, commit, and collaborate as needed. We also have all of our test case files on your local machine. So if you use another version control system outside of Git, we fully support that as well.

So once you have everything checked into a version control system for collaboration, we do offer a number of different options for continuous testing. I really just like to show this slide to say that no matter what your flavor of CI/CD is, Provar will be able to effectively enmesh itself in your day-to-day operations. This is by no means a comprehensive list; these are just some of the larger players that we've seen in the industry.

Harshit Paul (LambdaTest) - They do say an image is worth a thousand words, and I believe this slide does a lot of talking in that regard. It goes to show the entire 360-degree picture you've taken into consideration while making sure that Provar is well-integrated into the CI/CD and DevOps landscape.

Zac Taylor (Provar) - Yeah, absolutely. We really just want to ensure that no matter what flavor of CI/CD you're using, we'll be able to effectively add value. And if you're just starting your test automation journey, we have a number of in-house experts who are willing to give advice on particular systems, as well as, again, ensure that you get everything you need to know to get started on the right foot with test automation.

All right. So with that being said, that's the end of the slideshow, and we're gonna jump into the interactive portion of the demonstration now. To give you a high-level overview of what I'm gonna show: I have an end-to-end flow prepared that we'll cover at the end. I wanted to take some time to show how we can handle API interactions inside and outside of Salesforce, and how we grab those values, make assertions, etc., before finally ending on how we handle NitroX component-based architecture and some of the value that adds.

So now that we've transitioned into the interactive portion of the demonstration, I really just wanted to give you a lay of the land in our application before we jump into building test cases here. So again, we do have the options for a number of different screen types. We can plug these into things like SauceLabs, BrowserStack, as well as LambdaTest, and a number of different resolutions for emulations. We also offer a number of different browser options. As we're building our test cases, this will be done in Chrome.

But once you're executing them, you can execute them in any major browser you would like. We also offer a Chrome Headless option, again, for easy execution without actually rendering the browser on your local machine. Over here is the project structure that we have. Again, there are a few options for things like templates. We can add additional snippets of code. Within that is our test project, and we're going to be focusing on that for the majority of the demonstration tonight.

But to get started, we start with what we, again, effectively just dub connections, right? So whether that's a connection to your Salesforce org, to an external API, to a Gmail account, what have you, that's effectively where we're going to begin. So I have a few of these already prepped, and I'm just going to edit them really quickly so we can see what that looks like. We'll be using this demo org connection a little bit later. Really, we just choose the appropriate connection type here. Again, some connection types have additional options.

For Salesforce, we have options for communities and portals, as well as normal connections. We also have additional options for authentication. You can use standard credential-based passwords, and we also offer OAuth-based support. And as I mentioned in the slides, we do offer MFA and SSO as well. So again, it's very easy to do this in a declarative fashion: you just populate the username and password. You'll also notice I have some environments that will override the values specified here. Again, this allows me to port my test cases from one environment to another with a simple toggle.

Additionally, we can test these connections to make sure that everything is valid. We get a nice successful connection there. And because this is a Salesforce connection, we've now begun to establish metadata information and cache it. So over here in our org browser is where we can interact with all of this metadata. If I select this Account object, we can see all the records associated with it. If I expand this, we can see all the metadata information, including custom fields.

Again, very easy to interact with this. We can actually interact with these elements from the org browser. We can make API calls in a declarative fashion here simply by dragging and dropping them into our test case as well. So we've got a lot of power and versatility with a metadata-driven approach to testing, especially within Salesforce. Additionally, we have some other options as well. I'll cover this generic web service really briefly. So again, this is just an external API that we're gonna be playing around with as well.

Really, this is just weather data from different cities across the world. So I've just appended that base URL as well as the authentication type so that we can go ahead and connect to that API as well. All right, so with that, let's go ahead and get started with building out a simple test case here. So I'm going to click on this new test button. We're going to name this our LWC test, as we are going to be going into the Salesforce Dreamhouse app and playing around with some Lightning Web Components.

So we'll, again, choose a test folder here. I'll actually expand this to the webinar that we're doing. We'll drop this in the Dreamhouse app just so we have everything nice and tidy. And then, we need to choose that application or choose the connection specified for the demo org. And then, finally, we just need to choose a Salesforce application to load into. Again, this is all harvested from the user metadata information here.

So again, these are going to be specific to your org and user profile. So once we have that application selected, we'll click Finish. One of the first things you're going to notice over here is that Test Builder is now doing its thing. It's pretty appropriately named: it's what we're going to be using to build out our test cases. So let me drag this modal over here.

So once we get loaded into our Salesforce org, we're gonna notice that we're automatically logged into this DreamHouse app. So again, we've already skipped a couple of steps compared to other generic testing solutions, which would have to log in, search for this app in the list, and make that selection, whereas we're automatically logged in and ready to begin testing. So if I actually wanted to go over here and, say, create a contact, again, we would click on this Contacts tab. There's no need for me to map that.

Because of our metadata-driven approach to testing Salesforce, we automatically know we'll be on this page before we perform any interactions against it. So to map any elements, and this holds true for anything, whether it's inside of Salesforce, outside of Salesforce, Lightning Web Component, Node.js, or what have you, we can right-click and add to test case on those elements, and we'll pull in all of the information available for it.

Within Salesforce, we know this is a Salesforce layout. We know we're on the contact homepage. We know we're interacting with that new contact button. And we have that visual confirmation in the browser, again, just as a sanity check. There's also some interaction-type intelligence that we use as well. Because it's a button, nine times out of 10, we're probably going to click it. But again, it's very easy to change this in a declarative fashion if we wanted to, say, assert the visibility of this button for a particular user.

So again, I haven't made any modifications to this. All of that information was pulled in with a single right click and added to the test case. So now we're ready to select add and do. So once I select add and do here, we'll see the test steps populated. You'll notice that refresh in the browser is part of our auto navigation to have us land on this contacts tab before we perform those interactions. And as we can see here, we do get that real-time feedback that our test steps are working.

We do have the ability to pause, rewind, step backward, and step forward. We can also edit this test step on the fly in that same session. So again, when I was talking about the rework spiral and building, executing, fixing, rinse, and repeat, we can do all of that in the same session in a very easy fashion. So again, you do not have to set breakpoints, debug, and everything. We would just simply step backward and step forward. So I'm just going to walk through these. Again, these are pretty straightforward. Again, all of this metadata information is populated when we click that new button.

We'll select Add and Do there. And again, we'll see these on-screen steps and then the nested action steps beneath them. This is the basic structure for our UI-based flows: those action steps correspond to that particular on-screen page.

All right, so here, I'll just populate the required fields rather quickly. So I'll just use my last name here. And then I'll actually use content assist. This is just a pre-built library of functions that you can reference and add to. One of the ones I use quite a bit is this unique identifier function. It allows us to append an alphanumeric unique ID to give some data variance to our test data. So I'll select Add and Do there. We can watch that get instantiated here, and then we can move on to our next step.
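
Conceptually, a unique identifier function of this sort does something like the following; this is a generic sketch, not Provar's content assist implementation:

```java
import java.util.UUID;

public class TestData {
    // Append a short alphanumeric suffix so repeated runs never collide
    // on unique fields (e.g. "Taylor" -> "Taylor_3f9a2c1b").
    public static String unique(String base) {
        return base + "_" + UUID.randomUUID().toString()
                                .replace("-", "")
                                .substring(0, 8);
    }

    public static void main(String[] args) {
        System.out.println(unique("Taylor"));
    }
}
```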

So here we have a lookup field, so we'll right-click and add to the test case there. We handle lookup fields in Salesforce in a very seamless fashion. I can tell that there is a Barton Media value there. So once I select Add and Do, we'll not only populate this lookup field, but we'll also make a selection from it if there is an exact string match. So again, fewer moving pieces means fewer things that are likely to break.

In a generic testing setting, you have to populate these values, understand the list that's returned, and then make a selection from that list. Whereas with Provar, you populate your text, and you set and forget, so to speak. So again, a lot of these are just other UI-based fields, so I'm not going to spend too terribly much time on them. I just want to right-click and add to the test case on the Save button as well. We can also take some screenshots here if we want to see what that looks like. One other thing to call out with screenshots:

Even if you don't have these options toggled, Provar will automatically take a screenshot of the browser if there is a failure in your test execution. In the QA space, a picture really is worth a thousand words, and having that visual aid for debugging is immensely helpful. So at this point, we've created a very simple contact record here. We may wanna do some data validation within this. Let's assert that we have the appropriate account name set. So again, we'll navigate to that tab. You'll notice that I clicked that manually.

Once we right-click and add to the test case, it understands that this is a FlexiPage component, so it's different than the standard metadata mapping, but we're able to handle it. Additionally, we have a number of options for assertions here. So again, we could assert field-specific error messages, we could assert the label of this particular field, and we can grab values as well as particular attributes. And of the attributes, visibility is a great one to show, right? Some users and permission sets may have fields that are visible only to them, and you wanna be able to test that.

So again, it's a very easy series of checkboxes and radio buttons for a declarative approach to test automation. And everything we've seen here can also be parameterized. We have the ability to read from test sheets to make things data-driven and much more extendable as well. For the purpose of this demonstration, we're just going to use some statically coded values. So we'll just extract and assert the value here. And then there's one more thing I wanted to tie in, and I'll do this rather quickly when I hit Add and Do: there is some predictive intelligence based on fields and interactions.

So I clicked on this related tab here to effectively prove that our auto navigation will work. So once we know we need to make assertions on that details tab, we're actually intelligent enough to know that if we're not there, we need to navigate there. So again, compared to other generic testing solutions that would simply fail if the locator wasn't present, we have an underlying understanding of the architecture in Salesforce to ensure that our clients have a seamless declarative approach to it.

So again, basically everything I've shown you to this point has been metadata-driven, and again, that's in relation to Salesforce. I did want to really briefly look at some Lightning Web Component mappings as well, so I'm going to navigate to this Property Explorer tab. This entire Property Explorer page is a Lightning Web Component, and within it, this particular filter is another component as well. So we'll right-click and add to the test case here and see what that component-based architecture framework gives us.

So again, if we look over here in Test Builder now, we're able to determine that this is a NitroX component. It's a lightning card, meaning there can be multiple different cards in the deck, so to speak, to cycle through. So again, if we had an accordion style, we would be able to determine which specific card we're interacting with. Additionally, this is the filter card here. And then, if we actually step up one more, we can see that property filter component that encompasses the entire page.

So again, it's very declarative to handle here. If we wanted to edit this, we could. It's not necessary because this is, again, just a demo project. But if we wanted to use adaptive locators, we could scope this to a particular DOM element, or we could hard-code some XPath if we so choose. I think the adaptive locators are much more resilient, so that's the method I'm going to go with here. So I'll just search for San Francisco in there, and we'll select Add and Do to populate these values.

And again, just coming back to Lightning Web Components: they're generally heavily nested within shadow root and shadow DOM elements, which, as I mentioned in the slides, are quite tricky to interact with. And I'll just choose a slider option here as well. So again, we're going to right-click on that slider and select Add to Test Case. Then we actually have a nice little UI here. So again, if, say, we don't want to buy a house for more than half a million dollars, it's very easy to set that in a declarative fashion.

The whole goal for us testing Salesforce is again to build test cases the same way a manual tester would test them. That gives you more time, effort, and energy to do exploratory testing rather than refactoring and maintaining a code base.

All right. So again, that was just one example of how the component-based architecture works alongside how we handle metadata mappings within Salesforce. So I'm going to click Resume a few times here to finish this test case; when we're in Test Builder, it's waiting for us to add additional test steps. As you can see here, we do have this successful report. We can see the screenshot artifacts here from when I took the screenshots around that Save button: there's the screenshot before, and there's the screenshot after.

So we've covered UI-based flows here. I really did want to pivot and talk about how we handle API interactions, both inside and outside of Salesforce. So within Salesforce here, instead of actually creating this contact via the UI, we could also make an API call to spin up that contact if that's not necessarily the meat and potatoes of what we want to test.

Within Salesforce, it's trivial. I was actually pleasantly surprised when I came to Provar from my former Selenium frameworks. Making API calls in Salesforce is as simple as dragging and dropping the object you wanna interact with into your test case. So at this point, we could write a SOQL query; for those of you who are Salesforce savvy, SOQL is effectively Salesforce's SQL-like query language.

But if we wanted to make an API creation step to create this record, we could select that option. We have a very nice choose-fields modal, which is just a series of checkboxes to select the fields that we want in the API call. At this point, we would populate those values, we get that resulting object ID, and boom, we've successfully created a contact via the API in a declarative fashion. Now, external APIs require a little bit more tact. It's not quite drag and drop, but it's still pretty straightforward to do.
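
For reference, a declarative create step like this corresponds to a plain Salesforce REST API call under the hood. A minimal sketch using Java's built-in HTTP client; the instance URL, token handling, and field values are placeholders:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class CreateContact {
    public static void main(String[] args) throws Exception {
        String instanceUrl = "https://my-org.my.salesforce.com"; // placeholder
        String accessToken = System.getenv("SF_ACCESS_TOKEN");   // obtained via OAuth

        // POST a JSON body to the Contact sobject endpoint; the response
        // contains the new record's Id, mirroring the "resulting object ID"
        // exposed after a declarative API creation step.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(instanceUrl + "/services/data/v59.0/sobjects/Contact/"))
                .header("Authorization", "Bearer " + accessToken)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(
                        "{\"LastName\":\"Taylor_3f9a2c1b\",\"AccountId\":\"001XXXXXXXXXXXX\"}"))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```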

Again, I'm just going to edit this connection really quickly. We have this base URL as well as that API authentication key, so it's pretty straightforward to set up the connection. One thing to note: this API requires the API key to be passed in every request, so I created a variable for it. Then, when we actually go into this web request here, I can pass in that API key as a variable. So we do have the ability to variablize URLs and the like to make this a bit more dynamic.

And again, to make these API calls, we can go into the test palette here, which has a number of different UI, data, and design steps. So if we wanted to, say, make another REST request against this, we could just drag this into our test case, specify the appropriate API, and then choose which REST method we want.

So I'll just remove that really quickly. I did a couple of things with this one. As I mentioned previously, this pulls weather data for particular cities. So I grabbed the weather data for my city and wrote it into a result variable called AtlantaCurrent. I also grabbed weather data from a few months ago and wrote that into a JSON file. Provar does have the ability to write to CSV, Excel, and JSON files, as well as read them. So again, being data-driven has given a lot of clients a lot more resiliency in their testing.

So once I have both of those, we'll look at the different variable structures there and then do some comparisons. This first one is essentially me asserting that the status code is, in fact, set to 200: we're getting the appropriate response that we want. The next one does a comparison of temperatures between the two days. Of course, the temperatures aren't going to be the same today as they were a month ago, so I added in an expected exception there to get us to the final portion of the test.

So again, we do have the ability to override particular failures if we're expecting them; it's out-of-the-box functionality. And then finally, the last thing I did with this is a comparison of the data that we pulled from the API against the weather data within the JSON file. So I'll do a quick save here and run. In addition to Test Builder mode, we also have run and debug modes, and then we also have a Run Under option for remote execution.

So I'll run this under debug mode, just so we can see what that looks like in the report here. As we can see, we did have a failure at the very end; that was expected. I am going to click at the very end here now because we can actually go to our variables tab and view variable snapshots at different points in the test case. So I clicked at the very end, again, just so that we could view all of these variables. We'll start with this AtlantaCurrent, which is, again, the result variable from the API.

So again, we can see the status option here: the status code gave us 200, and we got an OK response. We only got one count, which makes sense, because we were only pulling one city. And then, within that data, we have the array of values that were returned by this API call. The way that we would effectively traverse this tree, if I go to our assertion here, is that if we want to assert the status of AtlantaCurrent, it would be AtlantaCurrent.status.code.

So again, we're effectively following that tree hierarchy to reference those variables in our test cases. And the same thing happens with the JSON contents here. These JSON contents are going to look effectively identical; we're pulling all of that data, there's just no metadata information associated with it.
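
In code terms, that traversal is just walking a parsed JSON tree node by node. A rough equivalent using the Jackson library, where the response shape mirrors the structure described above and the specific field names are assumptions:

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class WeatherAssertions {
    public static void main(String[] args) throws Exception {
        // A response shaped like the one described in the demo: a status
        // block, a count, and a data array (field names are assumptions).
        String json = """
                {"status": {"code": 200, "message": "OK"},
                 "count": 1,
                 "data": [{"city": "Atlanta", "temp_f": 71.5}]}""";

        JsonNode atlantaCurrent = new ObjectMapper().readTree(json);

        // Equivalent of asserting AtlantaCurrent.status.code == 200:
        // follow the tree hierarchy one node at a time.
        int code = atlantaCurrent.path("status").path("code").asInt();
        if (code != 200) {
            throw new AssertionError("Expected 200 but got " + code);
        }
        System.out.println("Temp: "
                + atlantaCurrent.path("data").get(0).path("temp_f").asDouble());
    }
}
```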

And if we go over here, we can actually look at these assertions. Again, we can see that these temperatures were not equal; that's expected, and we had an expected exception for that. The following list assertion was a failure, and again, this is because we're comparing data from different days. But as we can see, we were able to validate all of those particular list assertions as well. So again, it's pretty easy and pretty straightforward to make those API calls. Outside of Salesforce, they require a little bit more setup, but it is by no means heavily code-based.

All right. So we've covered two cases there. I really just wanted to show how we map UI-based flows and how we handle API responses before capping it off with our end-to-end scenario here. So I'll open up this Verify Lead and Send Email test case. There are quite a few moving parts in this one. Just at a high level: we're going to do a form submission in a Google Doc, and this Google Doc is going to send some information to our Salesforce org on the back end to effectively create a lead.

The same way we would get lead generation from a website, we're effectively mimicking that in our org. Once we validate that the data in the org is set and correct, we will then take that information, send an email, subscribe to that email inbox, and then validate that the email we sent has the appropriate headers, is sent to the appropriate recipients, etc. So before we do that, just to call out some of the architecture in the test case: we are reading some data from a value sheet. So again, we're just reading this Excel spreadsheet here, which I'll open rather quickly.

Again, it's pretty basic; it just has some contact information for the user that we're going to submit. And then, at this point, we actually have this test step, which is called Complete Contact Us Form. Now, this looks like a test step, but it's actually a completely containerized test case within an existing test case. That's denoted by this particular icon here. And those little down arrows there let us know that we can reference this test case multiple times in other test cases. So again, if we wanted to call this test case again, it's as simple as dragging and dropping that containerized test case into ours.

Additionally, if we double-click this test step, it will actually open up the test case itself so we can view all of those test steps as well. You'll notice that we do have those parameters set. So again, it's very easy to feed data-driven testing into this to populate that form. Once that form is completed, as I mentioned, we go into Salesforce. So I've written a very simple SOQL query here to grab this lead based on the source data, first name, and last name.
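
For reference, a SOQL lookup like this maps to Salesforce's REST query endpoint, which takes the query string as a URL-encoded parameter. A minimal sketch; the instance URL, API version, token, and filter values are placeholders:

```java
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class LeadQuery {
    public static void main(String[] args) throws Exception {
        String instanceUrl = "https://my-org.my.salesforce.com"; // placeholder
        String soql = "SELECT Id, FirstName, LastName, LeadSource FROM Lead "
                + "WHERE FirstName = 'Zac' AND LastName = 'Taylor_3f9a2c1b'";

        // The query endpoint takes the SOQL string as a URL-encoded q
        // parameter and returns matching records as JSON.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(instanceUrl + "/services/data/v59.0/query?q="
                        + URLEncoder.encode(soql, StandardCharsets.UTF_8)))
                .header("Authorization", "Bearer " + System.getenv("SF_ACCESS_TOKEN"))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}
```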

Then we use Provar's auto-navigation feature to navigate directly to this record before we perform our assertions. Finally, once we perform those assertions in Salesforce, we call this send-and-receive-email callable test case as well, again passing in that dynamic data to effectively send an email. We'll subscribe to that inbox, wait some time for it, and then use a wait-for method to poll.

So we'll effectively search to see if there are any new messages in the inbox before performing assertions against them. And then, finally, we'll just assert that the name is in there correctly. Again, it's very easy to variablize that as well. All right, so let's go ahead and get started here. I'm going to click this and run it via Test Builder. We'll just give it a moment to load here.
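
The wait-for method described here is a classic poll-with-timeout pattern. As a generic sketch using the Jakarta Mail IMAP API (this is not Provar's subscription mechanism; the host, account, and timing values are placeholders):

```java
import jakarta.mail.Folder;
import jakarta.mail.Message;
import jakarta.mail.Session;
import jakarta.mail.Store;
import jakarta.mail.search.SubjectTerm;
import java.util.Properties;

public class InboxPoller {
    // Poll the inbox until a message with the expected subject arrives,
    // failing if nothing shows up before the timeout.
    public static void waitForEmail(String subject, long timeoutMs) throws Exception {
        Session session = Session.getInstance(new Properties());
        try (Store store = session.getStore("imaps")) {
            store.connect("imap.example.com", "qa-inbox@example.com",
                    System.getenv("MAIL_PASSWORD")); // placeholder credentials
            long deadline = System.currentTimeMillis() + timeoutMs;
            while (System.currentTimeMillis() < deadline) {
                Folder inbox = store.getFolder("INBOX");
                inbox.open(Folder.READ_ONLY);
                Message[] hits = inbox.search(new SubjectTerm(subject));
                inbox.close(false);
                if (hits.length > 0) {
                    return; // message arrived; assertions would follow here
                }
                Thread.sleep(5_000); // back off between polling attempts
            }
            throw new AssertionError("No email with subject '" + subject + "' in time");
        }
    }
}
```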

So again, this is the contact form that we had built out for demo purposes. It's pretty vanilla, but as you can see, it's quite quick to map out these fields. I also just want to talk about things that we have page object model support for. So if we wanted to, or actually, let me step to this test case and edit it. You can right-click and edit here. Again, we can tell that this is a page object. If we wanted to edit the particular locator, we have different locator options based on what's available in the DOM.

Again, if this were a production-level web page, we would probably look for a more specific locator, but as I mentioned, this one is quite vanilla. So once we've made those changes, or any modifications we want, we can click Save and Do again before finishing out our test case. So again, we're still going to use right-click and Add to Test Case to map these fields; there are just a few more options for what we want to set within these particular locators.

So again, the form submission is pretty straightforward. The main purpose of that was, again, really just to get that data into Salesforce so we could show that end-to-end flow. So now at this point, we've actually closed out of that connection. Now we're going to launch our Salesforce admin connection here to log into that org. You may notice that we have multiple different connections to multiple different applications in the same test case. We can also intermingle UI and API interactions within the same test case as well. So again, there really is no limit to the functionality.

So here is where we combine our auto-navigation features. Now we're navigating to this lead page, and we're gonna perform those basic assertions based on the datasheet again before finally sending it out. So I'll click Resume once more. We'll actually kick off the process of sending that email, and I'll drag this inbox over. So again, you can see I've done a little bit of testing here, but we should be able to see a new email being dumped in here as soon as we subscribe and send our message.

And again, we're just doing a little bit of waiting here. When we were testing this, there were some latency issues when connecting to the org, so we added a few waits before we sent that message. Now we can see that the message has come in, and we're walking through the process of polling to ensure that the message has effectively hit the inbox.

Harshit Paul (LambdaTest) - I have to say that the test scenario has been very well thought out by you, Zac. It showcases the different hops and things that need to be completed in order to do the entire end-to-end scenario. And it's amazing to see how easy it is with the Test Builder functionality in Provar. So pretty impressive so far.

Zac Taylor (Provar) - Yeah, wonderful. Thank you so much. So again, just kind of coming back to what we've been showing here, I'll kind of click to the very end of the test case so we can see what those snapshots are. So we did all of these email validations via the API. Again, we could have traversed the UI for this, but again, the API is much easier to set up as well as reference.

So let me just expand this. Once we've established the subscription, we have a subscription variable that we've dubbed sub. Within that sub's messages here, we have that status, and we can see the messages that have come back. Additionally, within a particular message, we can see the body contents, who it was sent to, who was CC'd, as well as the subject, etc. So at the end of this, I did a quick assertion that sub.messages.body.bodyHTML contained the name with the unique identifier that we copy-pasted. So again, I just wanted to call that out as one of the different ways we can handle this and cover that full end-to-end scenario as well.

Harshit Paul (LambdaTest) - Well, that looks interesting, and quite a detailed scenario covered as easy as a breeze with the help of Test Builder. This makes me wonder, Zac, how people would run their tests over a scalable infrastructure, say, where LambdaTest could come in. So how do you integrate with LambdaTest? And could you make it a part of the demo as well?

Zac Taylor (Provar) - Yeah, absolutely. Once we've built our test cases, the important part is running them effectively and getting those results. Based on the CI/CD slide that I showed previously, there's a lot of flexibility and a lot of autonomy in the infrastructure that you'd like to connect to. But again, we have been working with LambdaTest to provide scalable, infrastructure-less test execution in the cloud to help clients from all walks of life, and at different points in their automation journey, get the most value.

Again, it's really straightforward. I've gone over here to our Test Settings tab, under Browser Providers. We can just add a new browser provider and give it a particular name. Then we choose the browser provider we want, for example, LambdaTest, and pass in the appropriate authentication credentials. And then we can do machine-specific testing as well as browser-specific testing. So again, maybe I'm on a Mac, so I'll choose a Mac here. We can choose particular browser versions as well as screen resolutions.

We can also add in particular properties and additional information if there's anything needed, as well as environment-specific variables. So again, it's very easy to set this up from within the application, get those test cases running on a regularly scheduled cadence, and get the results.
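
For comparison, wiring a raw Selenium test to LambdaTest's cloud grid looks roughly like this. The capability names follow LambdaTest's documented LT:Options format, but treat the specific platform, version, and resolution values as placeholders:

```java
import java.net.URL;
import java.util.HashMap;
import java.util.Map;
import org.openqa.selenium.chrome.ChromeOptions;
import org.openqa.selenium.remote.RemoteWebDriver;

public class LambdaTestRun {
    public static void main(String[] args) throws Exception {
        ChromeOptions options = new ChromeOptions();
        options.setBrowserVersion("latest");

        // Machine- and browser-specific settings, analogous to the fields
        // in a browser provider dialog (values are placeholders).
        Map<String, Object> ltOptions = new HashMap<>();
        ltOptions.put("username", System.getenv("LT_USERNAME"));
        ltOptions.put("accessKey", System.getenv("LT_ACCESS_KEY"));
        ltOptions.put("platformName", "macOS Sonoma");
        ltOptions.put("resolution", "1920x1080");
        options.setCapability("LT:Options", ltOptions);

        // Point the remote driver at LambdaTest's hub and run as usual.
        RemoteWebDriver driver = new RemoteWebDriver(
                new URL("https://hub.lambdatest.com/wd/hub"), options);
        driver.get("https://example.com");
        System.out.println(driver.getTitle());
        driver.quit();
    }
}
```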

Harshit Paul (LambdaTest) - Perfect, that helps explain it. This was especially important, and probably something for me to take back to the board, try out with these settings, and run some tests on Provar from my end too. Thank you.

Zac Taylor (Provar) - Yeah, absolutely. And then I'll just really quickly bring up the slideshow once more. We do have some Provar resources if you guys are interested in learning a bit more about Provar. We do have a customer success portal for our clients. We also offer University of Provar, which is a self-led training course that can help you get certified in our content. We also provide a healthy community forum for our clients to interact, ask questions, and provide feedback, as well as documentation links. So if you guys are interested in that, I'm happy to share that with you.

And then, finally, I'll just leave a thank you slide. Thank you so much, Harshit and LambdaTest, for hosting. I'm very happy to be a part of this.

Harshit Paul (LambdaTest) - Thanks for joining us and taking time out of your busy schedule, Zac. It was great having you, and I for sure had a lot to learn from this episode of the XP Series; I'm pretty sure the audience would be feeling the same. So thank you so much for giving us an end-to-end deep dive into how Provar can help facilitate test automation from one end to the other. Thank you so much to everybody who joined us. For more such episodes in the future, stay tuned to the XP Series to explore the limitless possibilities in the world of technology.

Till then, Happy testing!!

Past Talks

Democratize Automation to Build Autonomy and Go-To-Market Faster

In this webinar, you'll explore how democratizing automation is redefining organizational dynamics, cultivating autonomy, and helping teams go to market faster than ever before.

Watch Now ...
Testing AWS Applications Locally and on CI with LocalStack

In this XP Series webinar, Harsh Mishra, Engineer at LocalStack, showcases live demonstrations and advanced features, and highlights how LocalStack integrates with LambdaTest HyperExecute for faster test execution.

Watch Now ...
Man Vs Machine: Finding (replicable) bugs post-release

In this XP Webinar, you'll delve into the dynamic world of 'Man Vs Machine: Finding (replicable) bugs post-release.' Discover effective strategies and insights on navigating the evolving landscape of bug identification beyond the development phase.

Watch Now ...