Performance Testing With Selenium WebDriver: Detecting High Response Times
The Problem
Here at FloQast, we recently released the All Workflows Dashboard, which lets users easily see an overview of the progress of each workflow. This new page contains a filter that offers two types of timeline: workflow and team member.
The team member and workflow filters each hit multiple endpoints that request the data shown in their timelines. Unfortunately, each type of filter was taking 1-2 minutes to return a response.
The developers quickly identified the issues and started working on a fix, but the question we had was: “How do we identify this issue earlier in the future?” This is when we realized we needed to add a performance test to our QA Automation Framework.
Once we agreed on that, the next question was: “Where should we store the results of each performance test?” I suggested using the Google Sheets API to store the response times from each test in a shared Google Sheet, and running the performance test automatically every morning with Jenkins.
Google Sheets API
First, I created a reusable method for Google API authentication:
Note: refer to the Google Sheets API blog post for more details on the setup.
npm install googleapis (or yarn add googleapis)
import { google } from 'googleapis';
import { logger } from '../../src/lib/logger';
import { GOOGLE_SHEETS_KEYS } from '../constants';

const creds = {
    type: GOOGLE_SHEETS_KEYS.TYPE,
    project_id: GOOGLE_SHEETS_KEYS.PROJECT_ID,
    private_key_id: GOOGLE_SHEETS_KEYS.PRIVATE_KEY_ID,
    private_key: GOOGLE_SHEETS_KEYS.PRIVATE_KEY,
    client_email: GOOGLE_SHEETS_KEYS.CLIENT_EMAIL,
    client_id: GOOGLE_SHEETS_KEYS.CLIENT_ID,
    auth_uri: GOOGLE_SHEETS_KEYS.AUTH_URI,
    token_uri: GOOGLE_SHEETS_KEYS.TOKEN_URI,
    auth_provider_x509_cert_url: GOOGLE_SHEETS_KEYS.AUTH_PROVIDER_X509_CERT_URL,
    client_x509_cert_url: GOOGLE_SHEETS_KEYS.CLIENT_X509_CERT_URL
};

export const GoogleAPI = {
    spreadsheetId: GOOGLE_SHEETS_KEYS.SPREAD_SHEET_ID,
    authentication: async function () {
        logger.info('Generating Google Api Authentication');
        const auth = new google.auth.GoogleAuth({
            credentials: creds,
            scopes: 'https://www.googleapis.com/auth/spreadsheets'
        });
        const authClientObject = await auth.getClient();
        return { authClientObject, auth };
    }
};
This code:
- imports the googleapis package.
- imports our custom logger module for printing to the console.
- imports a constants module with the credentials needed for Google API authentication.
- exports the GoogleAPI object for later use, with a spreadsheetId key (the spreadsheet where we store the data) and an authentication method.
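The GOOGLE_SHEETS_KEYS constants module isn't shown above. As a rough sketch of what it could look like, assuming the service-account credentials come from environment variables (the variable names below are hypothetical), it might be:

// constants.js (hypothetical sketch): the real module may load these values differently.
export const GOOGLE_SHEETS_KEYS = {
    TYPE: 'service_account',
    PROJECT_ID: process.env.GOOGLE_PROJECT_ID,
    PRIVATE_KEY_ID: process.env.GOOGLE_PRIVATE_KEY_ID,
    // Private keys stored in env vars usually need their escaped newlines restored.
    PRIVATE_KEY: (process.env.GOOGLE_PRIVATE_KEY || '').replace(/\\n/g, '\n'),
    CLIENT_EMAIL: process.env.GOOGLE_CLIENT_EMAIL,
    CLIENT_ID: process.env.GOOGLE_CLIENT_ID,
    // Standard Google OAuth endpoints from the service-account JSON file.
    AUTH_URI: 'https://accounts.google.com/o/oauth2/auth',
    TOKEN_URI: 'https://oauth2.googleapis.com/token',
    AUTH_PROVIDER_X509_CERT_URL: 'https://www.googleapis.com/oauth2/v1/certs',
    CLIENT_X509_CERT_URL: process.env.GOOGLE_CLIENT_X509_CERT_URL,
    SPREAD_SHEET_ID: process.env.PERFORMANCE_SPREADSHEET_ID
};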
With the authentication helper in place, we can now come up with something like this:
import { google } from 'googleapis';
import { logger } from '../../src/lib/logger';
import { GoogleAPI } from '../../src/api/GoogleAPI';

const spreadsheetId = GoogleAPI.spreadsheetId;

/**
 * Data to send to Google Sheets
 * @param {array} data - multidimensional array with [data, data, data]
 */
export const insertPerformanceTestData = async (data) => {
    const { auth, authClientObject } = await GoogleAPI.authentication();
    const googleSheetsInstance = google.sheets({ version: 'v4', auth: authClientObject });

    logger.info('Adding performance test result data to sheet: Performance Test');
    await googleSheetsInstance.spreadsheets.values.append({
        auth,
        spreadsheetId,
        range: 'Performance Test!A:K', // columns A to K where data is going to be inserted.
        valueInputOption: 'USER_ENTERED',
        resource: { values: data }
    });
};
This code:
- imports the googleapis package.
- imports our custom logger module for printing to the console.
- imports the GoogleAPI object with the authentication method.
- exports a function that uses the Sheets instance to insert performance data into columns A through K.
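Because values.append expects a multidimensional array (one inner array per appended row), a call with a single sample row would look something like this; the values below are made up purely for illustration:

// Hypothetical example: each inner array becomes one row in columns A to K.
// (Called from an async context, e.g. an afterAll hook.)
await insertPerformanceTestData([
    [
        'Team Member Panel Card', // A: component
        'July_2021,June_2021,August_2021', // B: periods
        'Month End Close', // C: workflow (made-up name)
        'Average', // D: data size
        'September 1st 2021, 9:00:00 am', // E: date
        'July_2021,June_2021,August_2021', // F: periods
        '20', // G: number of iterations
        '1.2 Min, 1.1 Min, 1.3 Min', // H: individual times (truncated here)
        '1.2 Min', // I: mean
        '1.3 Min', // J: max
        '5.1 Sec' // K: standard deviation
    ]
]);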
The Test Helpers
First, I created a few helper functions that I can use later:
import { std, mean } from 'mathjs';
import { logger } from '../../src/lib/logger';
import moment from 'moment';

export let requestIterations = process.env.REQUEST_ITERATIONS || 20;
export let pandoraPerformanceDataSize = process.env.PERFORMANCE_DATA_SIZE || 'average';
pandoraPerformanceDataSize = pandoraPerformanceDataSize.charAt(0).toUpperCase() + pandoraPerformanceDataSize.slice(1);

// Converts milliseconds to a human-readable string (seconds, minutes, hours, or days).
function msToTime(ms) {
    let seconds = (ms / 1000).toFixed(1);
    if (seconds < 60) return `${seconds} Sec`;
    let minutes = (ms / (1000 * 60)).toFixed(1);
    if (minutes < 60) return `${minutes} Min`;
    let hours = (ms / (1000 * 60 * 60)).toFixed(1);
    if (hours < 24) return `${hours} Hrs`;
    let days = (ms / (1000 * 60 * 60 * 24)).toFixed(1);
    return `${days} Days`;
}

export const performanceResults = (requestTimes) => {
    if (!requestTimes || requestTimes.length === 0) return 0; // to avoid divide by 0
    return {
        stdDeviation: msToTime(std(requestTimes)),
        mean: msToTime(mean(requestTimes)),
        max: msToTime(Math.max(...requestTimes))
    };
};

export const performRequestIterations = async ({ times = requestIterations, callback, inParallel = true }) => {
    if (inParallel) {
        let requestTimes = [];
        for (let index = 1; index <= times; index++) {
            let time = getResponseTime(callback); // promise; resolved below by Promise.all
            requestTimes.push(time);
        }
        return Promise.all(requestTimes);
    }
    let requestTimes = [];
    for (let i = 1; i <= times; i++) {
        let time = await getResponseTime(callback);
        requestTimes.push(time);
        logger.info(`Iteration #${i} out of ${times}`);
    }
    return requestTimes;
};

export const getResponseTime = async (callback) => {
    const startTime = Date.now();
    await callback();
    return Date.now() - startTime;
};

// Wraps the raw response times with the calculated stats and a timestamp.
const performanceDataWrapper = (requestTimes) => {
    const performance = performanceResults(requestTimes);
    const requestTimesToTime = requestTimes.map((t) => msToTime(t));
    return {
        mean: performance.mean,
        max: performance.max,
        stdDeviation: performance.stdDeviation,
        date: moment().format('MMMM Do YYYY, h:mm:ss a'),
        times: requestTimesToTime
    };
};

// Formats the results as one spreadsheet row (columns A to K) and pushes it
// onto the shared results array (passed by reference).
export const assignResults = ({ requestTimes, query, results }) => {
    const { date, mean, max, stdDeviation, times } = performanceDataWrapper(requestTimes);
    const _results = [
        query.component, // A
        query.periods, // B
        query.workflow, // C
        pandoraPerformanceDataSize, // D
        date, // E
        query.periods, // F
        requestIterations.toString(), // G
        times.toString().split(',').join(', '), // H
        mean, // I
        max, // J
        stdDeviation // K
    ];
    results.push(_results);
};
Because response times are measured in milliseconds, I added the msToTime function, which converts milliseconds to seconds, minutes, hours, or days depending on how large the value is.
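For instance, given the thresholds in the function above, the conversion behaves like this (msToTime is internal to the helpers module; outputs shown for illustration):

msToTime(45230);     // '45.2 Sec'  (under a minute)
msToTime(95000);     // '1.6 Min'   (under an hour)
msToTime(5400000);   // '1.5 Hrs'   (under a day)
msToTime(172800000); // '2.0 Days'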
In performance testing, calculating the standard deviation is vital because it reflects the stability of the application, in this case an endpoint. That’s why I created the performanceResults function: it calculates the standard deviation, mean (or average), and max of the request response times using the mathjs npm package.
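As a quick sanity check, here is what performanceResults returns for three hypothetical response times of 61, 64, and 67 seconds (mathjs's std defaults to the sample standard deviation):

performanceResults([61000, 64000, 67000]);
// {
//   stdDeviation: '3.0 Sec', // sample standard deviation of the three times
//   mean: '1.1 Min',         // 64000 ms
//   max: '1.1 Min'           // 67000 ms
// }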
The functions:
- getResponseTime receives a callback, which is the HTTP request function in charge of sending a request to an endpoint and receiving data back. In this case we only want to know the response time; we don’t check the data itself.
- requestIterations is a variable that receives a value from Jenkins indicating the number of requests that should be performed.
- pandoraPerformanceDataSize is a variable that receives a value from Jenkins indicating the amount of data the performance test should run against.
- performRequestIterations takes care of performing the desired number of requests, either in parallel or in sequence (see the usage sketch after this list).
- performanceDataWrapper encapsulates the results so they can easily be used later by assignResults or any other function we add.
- assignResults formats the results and pushes them onto an array “passed by reference”.
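Here is a minimal usage sketch of performRequestIterations, with a stand-in callback instead of a real API call, showing both the parallel and the sequential modes:

// fakeRequest stands in for an HTTP call; it simply resolves after ~250 ms.
const fakeRequest = () => new Promise((resolve) => setTimeout(resolve, 250));

const runSketch = async () => {
    // Fire all iterations at once and wait for every response time.
    const parallelTimes = await performRequestIterations({ times: 5, callback: fakeRequest });
    // Or run them one after another, logging progress per iteration.
    const sequentialTimes = await performRequestIterations({ times: 5, callback: fakeRequest, inParallel: false });
    return { parallelTimes, sequentialTimes }; // arrays of millisecond durations
};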
Writing the Test
Now we can write the test, which uses the performRequestIterations, insertPerformanceTestData, and assignResults helper functions created above.
describe('All workflow dashboard performance test Team Member Overview', () => {
    let headers, workflows;
    let results = [];
    let data = {
        periods: 'July_2021,June_2021,August_2021',
        pathType: 'overview',
    };

    beforeAll(async () => {
        headers = await generatePandoraPerformanceHeaders(); // authentication based on selected pandoraPerformanceDataSize
        workflows = await UsersAPI.getUserWorkflows(headers);
    });

    afterAll(async () => {
        await insertPerformanceTestData(results); // here we insert the data into Google Sheets after the test is done.
    });

    it('team member timeline data panel cards', async () => {
        const _workflows = workflows.body
            .filter((workflow) => workflow.companyIds.length > 0)
            .map((workflows) => workflows.type)
            .toString();
        const query = {
            ...data,
            workflow: _workflows,
            headers,
            component: 'Team Member Panel Card',
            verbose: false,
        };
        const requestTimes = await performRequestIterations({
            callback: () => BiAPI.getTeamMemberTimelineOrOverview(query), // method that makes the request.
        });
        assignResults({ requestTimes, query, results });
    });
});
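One practical note: if the runner is Jest (an assumption here, based on the describe/it/beforeAll structure), twenty iterations of requests that can take 1-2 minutes each will blow past the default test timeout, so the timeout needs to be raised at the top of the test file:

// Assumes Jest as the test runner; adjust the value to the expected worst case.
jest.setTimeout(30 * 60 * 1000); // 30 minutes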
The Results
After setting up Jenkins with the above test file and running it with 20 iterations, we get this result:
Lessons Learned
- Performance testing is important because it reveals the stability of the application.
- Calculate the standard deviation, mean (or average), and max of the response times.
- Automate the performance test so that it runs automatically at a specific time.
- Store the performance test results for later analysis.
To learn more about performance testing, check out: