ChatGPT is more than a prompt-and-response platform. While individual prompts can help with SEO tasks, its real power comes from building a custom agent tailored to your workflow. This guide walks you through creating a ChatGPT agent that performs efficient on-page SEO audits.
SEO audits are crucial for any enterprise website. To streamline the process, I built a personalized ChatGPT agent, and in this guide I'll show you how to create your own and adapt it to your specific requirements.
Follow these simplified steps to create your own SEO audit agent:
- Set up a Cloudflare Worker that fetches a page's HTML.
- Configure a custom GPT in ChatGPT.
- Run your SEO audit agent.
By the end of this guide, you will have a bot that provides detailed SEO insights and actionable steps to enhance your website’s performance.
Setting Up a Cloudflare Worker
A Cloudflare Worker lets your agent fetch data from live websites so it can assess their current SEO state. Start by creating a free account:
- Visit http://pages.dev/
- Create an account using your preferred method.
Once registered, navigate to Add > Workers and select a template. I recommend starting with the “Hello World” option for simplicity.
After deploying your worker, click "Edit code" and replace the existing code with the following (the meta-tag patterns assume double-quoted attributes in a conventional order; adjust them if your pages' markup differs):
```javascript
addEventListener('fetch', event => {
  event.respondWith(handleRequest(event.request));
});

async function handleRequest(request) {
  const { searchParams } = new URL(request.url);
  const targetUrl = searchParams.get('url');
  const userAgentName = searchParams.get('user-agent');

  if (!targetUrl) {
    return new Response(
      JSON.stringify({ error: "Missing 'url' parameter" }),
      { status: 400, headers: { 'Content-Type': 'application/json' } }
    );
  }

  const userAgents = {
    googlebot: 'Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/121.0.6167.184 Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)',
    samsung5g: 'Mozilla/5.0 (Linux; Android 13; SM-S901B) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/112.0.0.0 Mobile Safari/537.36',
    iphone13pmax: 'Mozilla/5.0 (iPhone14,3; U; CPU iPhone OS 15_0 like Mac OS X) AppleWebKit/602.1.50 (KHTML, like Gecko) Version/10.0 Mobile/19A346 Safari/602.1',
    msedge: 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/42.0.2311.135 Safari/537.36 Edge/12.246',
    safari: 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_2) AppleWebKit/601.3.9 (KHTML, like Gecko) Version/9.0.2 Safari/601.3.9',
    bingbot: 'Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm) Chrome/',
    chrome: 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/129.0.0.0 Safari/537.36',
  };

  const userAgent = userAgents[userAgentName] || userAgents.chrome;
  const headers = {
    'User-Agent': userAgent,
    'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
    'Accept-Encoding': 'gzip',
    'Cache-Control': 'no-cache',
    'Pragma': 'no-cache',
  };

  try {
    const redirectChain = [];
    let currentUrl = targetUrl;
    let finalResponse;

    // Follow redirects manually so the full chain can be reported
    while (true) {
      const response = await fetch(currentUrl, { headers, redirect: 'manual' });

      // Record the current URL and status only if it hasn't been added yet
      if (!redirectChain.length || redirectChain[redirectChain.length - 1].url !== currentUrl) {
        redirectChain.push({ url: currentUrl, status: response.status });
      }

      // Check whether the response is a redirect
      if (response.status >= 300 && response.status < 400 && response.headers.get('location')) {
        // Resolve relative Location headers against the current URL and follow
        currentUrl = new URL(response.headers.get('location'), currentUrl).href;
      } else {
        // No more redirects; capture the final response
        finalResponse = response;
        break;
      }
    }

    if (!finalResponse.ok) {
      throw new Error(`Request to ${targetUrl} failed with status code: ${finalResponse.status}`);
    }

    const html = await finalResponse.text();

    // robots.txt and sitemap discovery
    const domain = new URL(targetUrl).origin;
    const robotsTxtResponse = await fetch(`${domain}/robots.txt`, { headers });
    const robotsTxt = robotsTxtResponse.ok ? await robotsTxtResponse.text() : 'robots.txt not found';
    const sitemapMatches = robotsTxt.match(/Sitemap:\s*(https?:\/\/[^\s]+)/gi) || [];
    const sitemaps = sitemapMatches.map(sitemap => sitemap.replace(/Sitemap:\s*/i, '').trim());

    // Metadata
    const titleMatch = html.match(/<title[^>]*>\s*(.*?)\s*<\/title>/i);
    const title = titleMatch ? titleMatch[1] : 'No Title Found';
    const metaDescriptionMatch = html.match(/<meta\s+name="description"\s+content="(.*?)"[^>]*>/i);
    const metaDescription = metaDescriptionMatch ? metaDescriptionMatch[1] : 'No Meta Description Found';
    const canonicalMatch = html.match(/<link\s+rel="canonical"\s+href="(.*?)"[^>]*>/i);
    const canonical = canonicalMatch ? canonicalMatch[1] : 'No Canonical Tag Found';

    // Open Graph tags (property="og:...")
    const ogTag = prop =>
      (html.match(new RegExp(`<meta\\s+property="og:${prop}"\\s+content="(.*?)"`, 'i')) || [])[1];
    const ogTags = {
      ogTitle: ogTag('title') || 'No Open Graph Title',
      ogDescription: ogTag('description') || 'No Open Graph Description',
      ogImage: ogTag('image') || 'No Open Graph Image',
    };

    // Twitter tags may use either name= or property=, hence the second capture group
    const twTag = name =>
      (html.match(new RegExp(`<meta\\s+(name|property)="twitter:${name}"\\s+content="(.*?)"`, 'i')) || [])[2];
    const twitterTags = {
      twitterTitle: twTag('title') || 'No Twitter Title',
      twitterDescription: twTag('description') || 'No Twitter Description',
      twitterImage: twTag('image') || 'No Twitter Image',
      twitterCard: twTag('card') || 'No Twitter Card Type',
      twitterCreator: twTag('creator') || 'No Twitter Creator',
      twitterSite: twTag('site') || 'No Twitter Site',
      twitterLabel1: twTag('label1') || 'No Twitter Label 1',
      twitterData1: twTag('data1') || 'No Twitter Data 1',
      twitterLabel2: twTag('label2') || 'No Twitter Label 2',
      twitterData2: twTag('data2') || 'No Twitter Data 2',
      twitterAccountId: twTag('account_id') || 'No Twitter Account ID',
    };

    // Headings
    const headings = {
      h1: [...html.matchAll(/<h1[^>]*>(.*?)<\/h1>/gis)].map(match => match[1]),
      h2: [...html.matchAll(/<h2[^>]*>(.*?)<\/h2>/gis)].map(match => match[1]),
      h3: [...html.matchAll(/<h3[^>]*>(.*?)<\/h3>/gis)].map(match => match[1]),
    };

    // Images
    const imageMatches = [...html.matchAll(/<img[^>]*src="(.*?)"[^>]*>/gi)];
    const images = imageMatches.map(img => img[1]);
    const imagesWithoutAlt = imageMatches.filter(img => !/alt=".*?"/i.test(img[0])).length;

    // Links
    const linkMatches = [...html.matchAll(/<a[^>]*href="(.*?)"[^>]*>/gi)];
    const links = {
      internal: linkMatches.filter(link => link[1].startsWith(domain)).map(link => link[1]),
      external: linkMatches.filter(link => !link[1].startsWith(domain) && link[1].startsWith('http')).map(link => link[1]),
    };

    // Structured data (JSON-LD)
    const schemaJSONLDMatches = [...html.matchAll(/<script[^>]*type="application\/ld\+json"[^>]*>(.*?)<\/script>/gis)];
    const schemas = schemaJSONLDMatches.map(match => match[1].trim());

    // Return everything as one JSON payload for the GPT to analyze
    return new Response(
      JSON.stringify({
        redirectChain, finalUrl: currentUrl, title, metaDescription, canonical,
        ogTags, twitterTags, headings, images, imagesWithoutAlt, links,
        robotsTxt, sitemaps, schemas,
      }, null, 2),
      { headers: { 'Content-Type': 'application/json' } }
    );
  } catch (error) {
    return new Response(
      JSON.stringify({ error: error.message }),
      { status: 500, headers: { 'Content-Type': 'application/json' } }
    );
  }
}
```
After entering the code, deploy your worker and copy its URL. This URL will be essential for the next steps.
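Before wiring the worker into ChatGPT, it's worth sanity-checking it directly. The worker expects a `url` query parameter and an optional `user-agent` key matching one of the names in the code above. A small sketch of building that request URL (the worker subdomain below is a placeholder — substitute the URL you copied):

```javascript
// Placeholder worker URL — replace with your own deployed worker's URL
const workerBase = 'https://seo-audit.example.workers.dev';

// Build the query URL the worker expects
function buildAuditUrl(pageUrl, userAgentName = 'chrome') {
  const u = new URL(workerBase);
  u.searchParams.set('url', pageUrl);              // page to audit (required)
  u.searchParams.set('user-agent', userAgentName); // one of the worker's UA keys
  return u.toString();
}

// Example: audit a page as Googlebot; pass the result to fetch() or open it in a browser
console.log(buildAuditUrl('https://example.com/', 'googlebot'));
```

If the worker is deployed correctly, requesting that URL returns the JSON payload of redirect chain, metadata, headings, links, and structured data.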
Configuring Your Custom GPT
To configure your GPT, open ChatGPT and navigate to "Explore GPTs."
Select “+ Create” and proceed to “Configure.” Here, you can input the necessary details to set up your SEO audit agent.
Below is a recommended configuration:
Name: OnPage SEO Audit

Description: Analyze the SEO performance of any webpage using custom user-agents. Get detailed insights into metadata, redirect chains, Open Graph tags, Twitter Cards, sitemaps, and more. Perfect for SEO professionals and developers.
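To let the GPT actually call your worker, add an Action in the "Actions" section of the Configure tab and paste in an OpenAPI schema pointing at the worker URL you copied earlier. A minimal sketch (the server URL and `operationId` are placeholders — substitute your own worker URL):

```json
{
  "openapi": "3.1.0",
  "info": { "title": "OnPage SEO Audit Worker", "version": "1.0.0" },
  "servers": [{ "url": "https://seo-audit.example.workers.dev" }],
  "paths": {
    "/": {
      "get": {
        "operationId": "auditPage",
        "summary": "Fetch on-page SEO data for a URL",
        "parameters": [
          { "name": "url", "in": "query", "required": true,
            "schema": { "type": "string" } },
          { "name": "user-agent", "in": "query", "required": false,
            "schema": { "type": "string",
              "enum": ["googlebot", "bingbot", "chrome", "safari", "msedge", "samsung5g", "iphone13pmax"] } }
        ],
        "responses": { "200": { "description": "SEO audit data as JSON" } }
      }
    }
  }
}
```

With the Action saved, asking the GPT to "audit https://example.com as Googlebot" lets it fetch the worker's JSON and reason over it.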
With the configuration complete, your custom ChatGPT agent is ready to perform comprehensive SEO audits, providing valuable insights and actionable recommendations.