I have written a few blog posts on using AI for testing. I want to continue with more practical ChatGPT uses for testers. Let's look at updating test files and creating data in a database.
Updating Files
At a previous company, a coworker had to update thousands of XML files because our test data changed. She said it would take her a few weeks to do this and asked if I had any ideas. I said I could write some C# code to make the updates for her. I estimated it would take me a day or so, and we both agreed that would be awesome. Spending 6-12 hours to automate the changes is a significant efficiency gain over the 80-120 hours the manual edits would have required.
I entered the following into the ChatGPT prompt:
I have over 1000 JSON files that need a bit of tweaking. Each file has an ‘Address’ element, and I need to split it into AddressLine1, AddressLine2, City, State, and Zipcode. Can you write code in Python to do this and give me instructions on how to execute it?
Now, I would rather use C#, but I chose Python in this case just to be forced to work with it, and it’s pretty simple to install and use. Here is the code that was generated:
import json
import os
from pathlib import Path

# The directory where your JSON files are stored
json_directory = Path("path/to/your/json/files")

# Function to split the address
def split_address(address):
    # Assuming the address is a comma-separated string in the format:
    # AddressLine1, AddressLine2, City, State, Zipcode
    parts = address.split(', ')
    return {
        "AddressLine1": parts[0] if len(parts) > 0 else "",
        "AddressLine2": parts[1] if len(parts) > 1 else "",
        "City": parts[2] if len(parts) > 2 else "",
        "State": parts[3] if len(parts) > 3 else "",
        "Zipcode": parts[4] if len(parts) > 4 else ""
    }

# Loop through all JSON files in the directory
for json_file in json_directory.glob('*.json'):
    with open(json_file, 'r', encoding='utf-8') as file:
        data = json.load(file)

    # Check if the 'Address' element exists
    if 'Address' in data:
        data.update(split_address(data.pop('Address')))

    # Write the modified data back to the file
    with open(json_file, 'w', encoding='utf-8') as file:
        json.dump(data, file, indent=4)

print("All files have been processed.")
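Before pointing the script at the full set of files, you can sanity-check the parsing by appending a quick call to split_address at the end of the script (the sample address below is made up):

sample = "123 Main St, Suite 200, Springfield, IL, 62704"
print(split_address(sample))
# {'AddressLine1': '123 Main St', 'AddressLine2': 'Suite 200',
#  'City': 'Springfield', 'State': 'IL', 'Zipcode': '62704'}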
I created a few JSON files and ran the script. It worked like a charm. It took less than 15 minutes to bring up ChatGPT, give it a prompt, create a JSON file and copy it a few times, install Python, execute the script, and validate the results.
Generating SQL Data
For another example of how testers can use ChatGPT, I entered the following prompt:
I have a page in an application that displays names in a grid with paging. I need 11 names to test the paging. The SQL table this data comes from is the Name table. It has three columns, Id, FirstName, LastName. Can you create a SQL statement that can insert 11 records in this table?
Here is the SQL it generated:
INSERT INTO Name (FirstName, LastName) VALUES
('John', 'Doe'),
('Jane', 'Doe'),
('Michael', 'Smith'),
('Emily', 'Jones'),
('Daniel', 'Brown'),
('Jessica', 'Taylor'),
('William', 'Davis'),
('Sarah', 'Wilson'),
('James', 'Johnson'),
('Linda', 'Miller'),
('David', 'Martinez');
Notice that the generated statement leaves Id out of the column list, which works as long as Id is an identity column (or otherwise gets a default value). This SQL is pretty simple to write, but giving ChatGPT the table structure for a few tables and asking it to write a more complex query isn't an issue either. Let's have it insert 10,000 records for us:
DECLARE @i INT = 0;
WHILE @i < 10000
BEGIN
    -- The generated name-building logic was removed because WordPress errored when
    -- saving the post; these counter-based placeholders stand in for it. Use
    -- whatever you want for the data.
    DECLARE @firstName NVARCHAR(50) = 'FirstName' + CAST(@i AS NVARCHAR(10));
    DECLARE @lastName NVARCHAR(50) = 'LastName' + CAST(@i AS NVARCHAR(10));

    INSERT INTO Name (FirstName, LastName) VALUES (@firstName, @lastName);
    SET @i = @i + 1;
END
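Once the loop finishes, a quick count against the same table confirms the rows landed:

SELECT COUNT(*) FROM Name;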
Give it any requirements for the data that you want. Join on many tables if you need to.
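As a rough sketch of what a join-based request can produce (the Customer and Region tables and their columns here are hypothetical, not part of the example above), a prompt like "insert an order for every customer in the West region" might come back as something along these lines:

INSERT INTO [Order] (CustomerId, OrderDate, Total)
SELECT c.Id, GETDATE(), 0
FROM Customer c
JOIN Region r ON r.Id = c.RegionId
WHERE r.Name = 'West';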
Conclusion
There are many practical ways ChatGPT, or any other AI tool, can help testers be more productive. I like using it to create and modify data, especially the large datasets needed for testing. Generating code is another good use. You can check out my Selenium C# Component custom GPT as well. Testers can become more efficient when they put this technology to work.