Technical Glossary

Comprehensive definitions of press release, SEO, and AI terms. From basic PR fundamentals to advanced AI concepts—everything explained in plain English.

📰 Press Release Fundamentals

Press Release

A formal written announcement distributed to media outlets, news platforms, and journalists to publicize newsworthy events, product launches, company milestones, or important updates. Press releases follow a standardized format with headline, dateline, body, boilerplate, and contact information. They serve dual purposes: attracting media coverage and building SEO authority through indexed, structured content.

Think of a press release as a business's official megaphone—it amplifies important news to journalists, search engines, and AI systems simultaneously.

Boilerplate

The standard "About [Company]" paragraph that appears at the end of every press release, providing consistent company background information. Boilerplates typically include founding date, mission statement, key products/services, notable achievements, and company location. This reusable paragraph establishes company identity and context for readers unfamiliar with the business.

Distribution / PR Distribution

The process of sending press releases to targeted media outlets, news platforms, journalists, and syndication networks. Modern distribution includes traditional newswires, direct journalist outreach, SEO-optimized publication on indexed platforms, and integration with AI search systems. Effective distribution maximizes visibility across human readers and machine indexers.

Embargo

A "do not publish before [date/time]" restriction placed on press releases to give journalists advance notice while controlling publication timing. Embargoes allow reporters time to research, interview sources, and prepare quality coverage while ensuring coordinated announcement timing across multiple outlets. Breaking an embargo damages journalist relationships and future media coverage.

Media Kit / Press Kit

A collection of company assets provided to journalists to support press coverage, including high-resolution logos, product images, executive headshots, fact sheets, company timeline, and previous press releases. Media kits streamline journalist workflows by providing ready-to-use, publication-quality materials.

News Hook

The compelling angle or timely element that makes a press release newsworthy and interesting to journalists and readers. Strong news hooks tie company announcements to trending topics, industry challenges, consumer pain points, or current events. Without a clear hook, press releases get ignored by media and readers.

Newswire

Traditional press release distribution networks that syndicate announcements to media outlets, news aggregators, and journalists. Major newswires include PR Newswire, Business Wire, and GlobeNewswire. Newswires offer broad reach but are expensive ($500-5,000 per release) and often lack the SEO optimization and AI integration that specialized platforms now offer.

Pitch

A personalized email or message sent to journalists proposing a story angle or offering exclusive access to company news. Pitches are shorter and more conversational than press releases, tailored to individual reporters' beats and interests. Effective pitches highlight why the story matters to the journalist's specific audience.

🔍 SEO & Discovery

Alt Text

Descriptive text added to image HTML tags that describes image content for screen readers (accessibility) and search engines (SEO). Alt text ensures visually impaired users understand image content and helps search engines index images properly. Well-written alt text improves both user experience and search rankings.
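
For example, a minimal image tag with descriptive alt text (the file name and wording are illustrative):

```html
<!-- Descriptive alt text: tells screen readers and search engines what the image shows -->
<img src="/images/press-release-launch.jpg"
     alt="Founder presenting the new plant delivery app at the Dublin launch event">
```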

Canonical URLs

The preferred version of a webpage URL specified via canonical tags to prevent duplicate content penalties and consolidate SEO authority. When the same content exists at multiple URLs, canonical tags tell search engines which version to index and rank, preventing your own pages from competing against each other.
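
For example, a canonical tag in the page head (URL illustrative):

```html
<!-- Tells search engines this URL is the preferred version to index -->
<link rel="canonical" href="https://example.com/news/product-launch">
```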

Core Web Vitals

Google's metrics for page experience: Largest Contentful Paint (LCP - loading speed), Interaction to Next Paint (INP - responsiveness, which replaced First Input Delay, FID, as a Core Web Vital in 2024), and Cumulative Layout Shift (CLS - visual stability). The original trio became official ranking factors in 2021. Fast, responsive, stable pages rank higher than slow, janky ones.
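
One common way to measure these metrics in the field is Google's open-source web-vitals library; a minimal sketch, assuming the `web-vitals` npm package is installed:

```typescript
import { onLCP, onINP, onCLS } from "web-vitals";

// Each callback fires once the metric is finalized for the current page view.
// Here we just log the values; in production you would send them to analytics.
onLCP((metric) => console.log("LCP (ms):", metric.value));    // loading speed
onINP((metric) => console.log("INP (ms):", metric.value));    // responsiveness
onCLS((metric) => console.log("CLS (score):", metric.value)); // visual stability
```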

Crawling

The process by which search engine bots (like Googlebot) discover and scan web pages by following links from page to page. Crawling precedes indexing—if your content isn't crawled, it can't be indexed or ranked. Robots.txt files and sitemap.xml files help control crawling behavior.

Domain Authority (DA)

A metric (0-100) developed by Moz that predicts how well a website will rank in search results. DA considers backlink profile, linking domain count, and other factors. High-DA sites (70+) like major news outlets pass more SEO value through their links than low-DA sites. Building backlinks from high-DA sources significantly improves rankings.

Google Indexed

Content that has been crawled, analyzed, and stored in Google's search database, making it discoverable through search queries. Indexing is essential for visibility—unindexed content cannot appear in search results regardless of quality. Use Google Search Console to monitor indexing status.

Keywords

Words and phrases that users type into search engines when looking for information. Effective SEO requires identifying high-volume, relevant keywords and naturally incorporating them into content, headlines, and metadata. Keyword research reveals what your audience is actually searching for, not what you think they're searching for.

Meta Description

The 150-160 character summary that appears below page titles in search results. While not a direct ranking factor, compelling meta descriptions significantly improve click-through rates by telling searchers what they'll find on your page. Think of it as your page's sales pitch in search results.
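
For example (wording illustrative):

```html
<meta name="description"
      content="Pressonify helps startups publish SEO- and AI-optimized press releases in minutes, no agency required.">
```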

Open Graph Tags

Meta tags that control how content appears when shared on social media platforms like Facebook, LinkedIn, and Twitter. Open Graph tags specify the title, description, image, and URL displayed in social previews. Without proper OG tags, social shares show generic, unappealing previews that reduce engagement.
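
A typical set of Open Graph tags (values illustrative):

```html
<meta property="og:title"       content="Acme Launches AI-Powered Plant Delivery">
<meta property="og:description" content="Acme's new service delivers plants nationwide within 24 hours.">
<meta property="og:image"       content="https://example.com/images/launch-hero.jpg">
<meta property="og:url"         content="https://example.com/news/acme-launch">
<meta property="og:type"        content="article">
```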

301 Redirect

A permanent redirect from one URL to another that passes 90-99% of link equity to the new URL. Use 301 redirects when moving pages, changing URL structure, or consolidating duplicate content. Properly implemented 301s preserve SEO value during site migrations or URL changes.
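
At the HTTP level, a 301 looks like this (URLs illustrative): the server answers the request for the old URL with a Location header pointing to the new one.

```http
GET /old-press-release HTTP/1.1
Host: example.com

HTTP/1.1 301 Moved Permanently
Location: https://example.com/news/new-press-release
```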

Rich Snippets

Enhanced search results that display extra information such as star ratings, images, publication dates, or prices alongside the standard title and description. Rich snippets increase click-through rates by providing more context and visual appeal in search results. They require proper Schema.org markup to trigger.

Robots.txt

A text file placed at a website's root directory (yoursite.com/robots.txt) that tells search engine crawlers which pages or sections they should or shouldn't crawl. Robots.txt helps manage crawl budget and prevent indexing of admin pages, duplicate content, or staging sites.
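
A minimal illustrative robots.txt that blocks admin and staging pages, allows everything else, and points crawlers at the sitemap:

```txt
# Applies to all crawlers
User-agent: *
Disallow: /admin/
Disallow: /staging/
Allow: /

Sitemap: https://example.com/sitemap.xml
```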

SEO Optimized

Content structured and formatted to rank highly in search engine results through keyword optimization, meta tags, structured data, fast loading, mobile responsiveness, and quality backlinks. SEO optimization involves both on-page elements (content, HTML) and off-page factors (backlinks, authority signals).

Sitemap (XML Sitemap)

An XML file listing all important pages on a website to help search engines discover and crawl content efficiently. Sitemaps include priority levels, update frequency, and last modification dates. Submitting sitemaps to Google Search Console speeds up indexing of new content.
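
A minimal sitemap with a single entry (URL and dates illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/news/acme-launch</loc>
    <lastmod>2025-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```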

Twitter Cards (X Cards)

Meta tags that control how links appear when shared on Twitter/X, similar to Open Graph tags for other platforms. Twitter Cards enable rich media attachments, including images, videos, and article summaries. Without Twitter Card tags, links show as plain text without visual appeal.
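
Typical Twitter Card tags (values illustrative), usually placed alongside the Open Graph tags:

```html
<meta name="twitter:card"        content="summary_large_image">
<meta name="twitter:title"       content="Acme Launches AI-Powered Plant Delivery">
<meta name="twitter:description" content="Nationwide plant delivery within 24 hours.">
<meta name="twitter:image"       content="https://example.com/images/launch-hero.jpg">
```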

🎯 AI Optimization (AEO/GEO/LLMO)

AEO (Answer Engine Optimization)

The practice of optimizing content to provide direct, concise answers to specific questions, targeting AI assistants (ChatGPT, Siri, Alexa) and featured snippets. AEO structures content with clear answers in the first 2-3 sentences, uses FAQ sections with Schema.org FAQPage markup, and creates answer-focused content blocks optimized for voice search queries. While SEO aims to "rank on Page 1," AEO aims to "be the answer."

SEO gets you on the list of 10 blue links. AEO makes you THE answer when someone asks "What is the best plant delivery service in Ireland?"
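
For example, a FAQ answer marked up with Schema.org FAQPage so answer engines can lift it directly (content illustrative):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "What is the best plant delivery service in Ireland?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Acme Plants delivers nationwide in Ireland within 24 hours and offers a 30-day freshness guarantee."
    }
  }]
}
</script>
```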

GEO (Generative Engine Optimization)

The practice of optimizing content to be cited as a source in AI-generated summaries and responses. When ChatGPT, Perplexity, or Google AI Overviews generates an answer, they typically cite 3-5 sources—GEO is about becoming one of those sources. This requires providing authoritative, fact-dense content with clear attribution, statistics, verifiable facts, and Schema.org structured data. 40% of searches now happen in AI assistants, making GEO essential for brand visibility.

AEO provides the answer. GEO gets you credited as the authoritative source behind that answer.

LLMO (Large Language Model Optimization)

The practice of ensuring large language models accurately interpret, understand, and contextualize your content when processing it. LLMs don't "read" like humans—they tokenize, embed, and contextualize text based on learned patterns. LLMO involves using semantic HTML5 markup, clear unambiguous language, contextual clues (definitions, examples), hierarchical information structure, and consistent Schema.org vocabulary. Poor LLMO leads to misinterpretation; good LLMO ensures AI understands your content accurately.

GEO gets you cited. LLMO ensures the AI doesn't misquote you or misunderstand what your company actually does.

The Five-Layer Optimization Stack

The complete content optimization lifecycle for the AI era, consisting of five complementary layers: SEO (search engine ranking), AEO (direct answers), GEO (AI citations), LLMO (AI interpretation), and ADP (AI discoverability). Each layer addresses a different aspect of how content gets discovered, interpreted, and cited by both traditional search engines and AI systems. Pressonify.ai implements all five layers automatically for every press release.

AI Overviews (Google SGE)

Google's AI-generated summaries that appear at the top of search results, synthesizing information from multiple sources to answer queries directly. Originally called Search Generative Experience (SGE), AI Overviews represent Google's shift from showing link lists to providing AI-generated answers. Content optimized with GEO and structured data has higher chances of being cited in these summaries.

Zero-Click Search

Search queries where users get their answer directly in the search results without clicking through to any website. Featured snippets, knowledge panels, and AI Overviews all contribute to zero-click searches. While this reduces website traffic, being the cited source in zero-click results provides significant brand visibility and authority. AEO specifically targets zero-click visibility.

🔗 AI Discovery Protocol (ADP)

ADP (AI Discovery Protocol)

An open standard (MIT License) for making websites structurally discoverable by AI systems like ChatGPT, Claude, Perplexity, and Gemini. Unlike traditional SEO built for keyword-based search, ADP provides structured, machine-readable metadata specifically designed for AI reasoning engines. The protocol consists of three core files: /ai-discovery.json (meta-index entry point), /knowledge-graph.json (entity catalog), and /llms.txt (AI-readable context). ADP enables AI crawlers to discover 100% of entities on a site versus the ~60% coverage of traditional crawling.

If traditional SEO is like scattering breadcrumbs for Google to find, ADP is like handing AI systems a complete map of your entire content ecosystem.

ai-discovery.json

The canonical entry point file for the AI Discovery Protocol, located at /ai-discovery.json. This meta-index maps all AI-optimized resources on a website, including links to the knowledge graph, llms.txt, and other discovery endpoints. AI systems hit this single file first, then discover all other resources from the index—eliminating the need to crawl hundreds of HTML pages to reconstruct entity relationships.
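
The exact fields are defined by the ADP specification; purely as an illustration of what such a meta-index might contain (field names here are assumptions, not the official schema):

```json
{
  "version": "2.1",
  "site": "https://example.com",
  "resources": {
    "knowledgeGraph": "https://example.com/knowledge-graph.json",
    "llmsTxt": "https://example.com/llms.txt",
    "sitemap": "https://example.com/sitemap.xml"
  },
  "updated": "2025-01-15T09:00:00Z"
}
```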

knowledge-graph.json

A JSON-LD file containing the complete entity catalog for a website using Schema.org vocabulary. Located at /knowledge-graph.json, it provides AI systems with all entities (Organizations, NewsArticles, Products, Persons) and their relationships in a single structured file. Unlike scattered JSON-LD across HTML pages, knowledge-graph.json gives AI complete entity coverage with proper relationship mapping.
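
An illustrative fragment using standard Schema.org JSON-LD with an @graph array (entities and values invented for the example):

```json
{
  "@context": "https://schema.org",
  "@graph": [
    {
      "@type": "Organization",
      "@id": "https://example.com/#org",
      "name": "Acme Plants",
      "url": "https://example.com"
    },
    {
      "@type": "NewsArticle",
      "@id": "https://example.com/news/acme-launch#article",
      "headline": "Acme Launches AI-Powered Plant Delivery",
      "datePublished": "2025-01-15",
      "publisher": { "@id": "https://example.com/#org" }
    }
  ]
}
```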

llms.txt

A human-readable markdown file at /llms.txt that provides comprehensive context about a website for AI language models. Originally proposed by Jeremy Howard, llms.txt describes the platform's purpose, key pages, features, technical details, and recent content—everything an AI assistant needs to accurately describe and recommend the service. It's the "about page" optimized for machines rather than humans.
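
A shortened, invented example: the llms.txt proposal uses plain markdown with a title, a short summary, and annotated links.

```txt
# Acme Plants

> Acme Plants is a nationwide plant delivery service offering 24-hour delivery across Ireland.

## Key pages

- [Press room](https://example.com/news): all company announcements
- [Product catalog](https://example.com/plants): full range with pricing

## Recent news

- [Acme launches AI-powered delivery](https://example.com/news/acme-launch)
```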

ADP Compliance

The state of a website implementing the AI Discovery Protocol specification. ADP compliance has three levels: Level 1 (Minimal) requires only ai-discovery.json (15 minutes to implement); Level 2 (Standard) adds knowledge-graph.json and llms.txt (2-4 hours); Level 3 (Advanced) adds versioning and change logs for incremental crawling (1-2 days). Pressonify.ai implements Level 3 compliance for all published content.

X-ADP-Version Header

An HTTP response header that indicates which version of the AI Discovery Protocol a server implements. For example, X-ADP-Version: 2.1 signals ADP v2.1 compliance to AI crawlers. This header allows AI systems to understand the protocol capabilities and parse responses accordingly.

X-LLM-Optimized Header

An HTTP response header (X-LLM-Optimized: true) that signals to AI crawlers that content has been specifically optimized for large language model consumption. This includes structured data, semantic markup, and machine-readable metadata. AI systems may prioritize LLM-optimized content for citation.
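
Both of these ADP headers are ordinary HTTP response headers; an AI crawler fetching a compliant page might see, for example:

```http
HTTP/1.1 200 OK
Content-Type: text/html; charset=utf-8
X-ADP-Version: 2.1
X-LLM-Optimized: true
```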

IndexNow

A protocol that allows websites to instantly notify search engines (Bing, Yandex, and partners) when content is created, updated, or deleted. Instead of waiting for crawlers to discover changes, IndexNow pushes notifications proactively. Pressonify auto-pings IndexNow when press releases are published, ensuring near-instant indexing for AI and traditional search engines.
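
A minimal sketch of an IndexNow ping using the public endpoint at api.indexnow.org (the key and URLs are placeholders; your key must also be hosted at the keyLocation URL):

```typescript
// Notify IndexNow-participating search engines that URLs have been published or changed.
async function pingIndexNow(urls: string[]): Promise<void> {
  const response = await fetch("https://api.indexnow.org/indexnow", {
    method: "POST",
    headers: { "Content-Type": "application/json; charset=utf-8" },
    body: JSON.stringify({
      host: "example.com",
      key: "your-indexnow-key",                                  // placeholder key
      keyLocation: "https://example.com/your-indexnow-key.txt",  // where the key file lives
      urlList: urls,
    }),
  });
  console.log("IndexNow responded with status", response.status);
}

// Example: ping right after publishing a press release
void pingIndexNow(["https://example.com/news/acme-launch"]);
```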

👁️ AI Visibility & Tracking

AI Visibility Engine

A system for tracking and measuring AI crawler engagement with website content. The AI Visibility Engine identifies when AI bots (GPTBot, ClaudeBot, PerplexityBot, etc.) visit pages, logs their activity, and calculates visibility scores. This provides actionable insights into how discoverable your content is to AI systems—the AI equivalent of Google Analytics for traditional search.

Traditional analytics track human visitors. The AI Visibility Engine tracks robot visitors—and in 2025, robots drive 40% of discovery.

AI Attention Score

A proprietary metric (0-100) that measures how much attention AI systems are paying to specific content. The score combines bot visit frequency, crawler type weighting (GPTBot visits are weighted differently than Googlebot), path depth analysis, temporal decay, and LLM referral attribution. Higher AI Attention Scores correlate with increased likelihood of AI citation.
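
The actual formula is proprietary; purely as an illustration of the kind of weighting described above (all weights, names, and numbers here are invented):

```typescript
interface BotVisit {
  crawler: "GPTBot" | "ClaudeBot" | "PerplexityBot" | "Googlebot";
  daysAgo: number;   // how long ago the visit happened
  pathDepth: number; // e.g. /news/acme-launch has depth 2
}

// Illustrative only: weight AI crawlers higher, decay older visits,
// and give slightly more credit to deeper content pages.
const CRAWLER_WEIGHT: Record<BotVisit["crawler"], number> = {
  GPTBot: 1.0, ClaudeBot: 1.0, PerplexityBot: 0.9, Googlebot: 0.5,
};

function attentionScore(visits: BotVisit[]): number {
  const raw = visits.reduce((sum, v) => {
    const decay = Math.exp(-v.daysAgo / 30);               // temporal decay
    const depthBonus = 1 + Math.min(v.pathDepth, 5) * 0.05; // path depth analysis
    return sum + CRAWLER_WEIGHT[v.crawler] * decay * depthBonus;
  }, 0);
  return Math.min(100, Math.round(raw * 10));               // clamp to a 0-100 scale
}
```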

LLM Referral

Traffic that arrives at your website after being cited by an AI assistant. When ChatGPT, Claude, or Perplexity cite your content and users click the citation link, the HTTP Referer header identifies the AI source. LLM Referral tracking measures how effectively your content is being cited and driving traffic from AI systems—the new "backlink" of the AI era.
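
A minimal sketch of classifying a request's Referer header against known AI assistant domains (the domain list is illustrative, not exhaustive):

```typescript
// Map referring domains to the AI assistant they belong to (illustrative list).
const AI_REFERRERS: Record<string, string> = {
  "chatgpt.com": "ChatGPT",
  "chat.openai.com": "ChatGPT",
  "perplexity.ai": "Perplexity",
  "claude.ai": "Claude",
  "gemini.google.com": "Gemini",
};

function detectLlmReferral(refererHeader: string | null): string | null {
  if (!refererHeader) return null;
  try {
    const host = new URL(refererHeader).hostname.replace(/^www\./, "");
    return AI_REFERRERS[host] ?? null; // e.g. "Perplexity", or null for non-AI traffic
  } catch {
    return null; // malformed Referer header
  }
}
```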

Bot Tracker Middleware

Server-side middleware that intercepts HTTP requests and identifies AI crawler visits by analyzing User-Agent strings. The Bot Tracker distinguishes between GPTBot (OpenAI), ClaudeBot (Anthropic), PerplexityBot, GoogleBot, BingBot, and other crawlers, logging each visit with timestamp, path, and crawler type for analytics.
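
A simplified sketch of such middleware, assuming an Express-style server (the crawler list and logging call are illustrative):

```typescript
import type { Request, Response, NextFunction } from "express";

// Substrings that identify major AI and search crawlers in User-Agent headers.
const KNOWN_CRAWLERS = ["GPTBot", "ClaudeBot", "PerplexityBot", "Googlebot", "bingbot"];

export function botTracker(req: Request, res: Response, next: NextFunction): void {
  const userAgent = req.headers["user-agent"] ?? "";
  const crawler = KNOWN_CRAWLERS.find((name) => userAgent.includes(name));

  if (crawler) {
    // In a real system this would be written to a database or analytics pipeline.
    console.log(JSON.stringify({ crawler, path: req.path, at: new Date().toISOString() }));
  }
  next(); // always pass the request through; tracking must never block the response
}
```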

AI Crawler

Automated bots operated by AI companies to index web content for their language models and search features. Major AI crawlers include: GPTBot (OpenAI/ChatGPT), ClaudeBot (Anthropic/Claude), PerplexityBot (Perplexity AI), Google-Extended (Gemini), Applebot-Extended (Apple Intelligence), and Meta-ExternalAgent (Meta AI). Websites can allow or block specific crawlers via robots.txt.
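
For example, a robots.txt that welcomes OpenAI's and Anthropic's crawlers but opts out of Google-Extended, using the crawler names listed above:

```txt
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: Google-Extended
Disallow: /
```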

MCP (Model Context Protocol)

An open protocol developed by Anthropic that enables AI assistants to securely connect to external data sources and tools. MCP provides a standardized way for AI systems to access APIs, databases, and services while maintaining security boundaries. For AI Visibility, MCP servers can expose bot analytics, discovery metrics, and content management tools directly to AI assistants for programmatic access.

MCP is like giving AI assistants a standardized set of "plugins" to interact with your systems—read your analytics, manage your content, check your SEO scores—all through secure, defined interfaces.

Closed-Loop AI Optimization

The combination of outbound discovery (ADP) with inbound tracking (AI Visibility Engine) creating a feedback loop for continuous optimization. Publish content → ADP makes it discoverable → Track which AI systems visit → Measure citation referrals → Optimize content based on data → Republish. This closed loop enables data-driven AI SEO rather than guesswork.

Seven-Layer AI Discovery

Pressonify's comprehensive AI discovery architecture comprising seven layers of optimization: (1) Schema.org JSON-LD structured data, (2) Knowledge Graph entity relationships, (3) llms.txt AI context, (4) ai-discovery.json meta-index, (5) IndexNow instant notification, (6) AI crawler permissions in robots.txt, and (7) AI Visibility tracking and analytics. Together these layers ensure maximum discoverability and measurable AI engagement.

🤖 AI & Machine Learning

Agentic AI

Autonomous AI systems that can perceive environments, make decisions, and take actions to achieve specific goals without constant human intervention. Unlike passive tools that wait for commands, agentic AI proactively plans, executes, and adapts—like an assistant that doesn't just answer questions but independently solves problems.

Think of the difference between a calculator (a tool that needs step-by-step instructions) and a personal assistant (an agent that figures out the steps itself).

Anti-Hallucination Engine

AI systems designed to prevent language models from generating false or invented information by implementing fact-checking and validation layers. These engines verify claims against reliable sources, detect inconsistencies, and flag low-confidence outputs before publication.

Imagine a fact-checker reading over the AI's shoulder, catching it when it starts to "make things up" (hallucinate) before anything gets published.

Citation (AI Citation)

When AI systems like ChatGPT, Claude, or Perplexity reference and link to your content as a source when answering user queries. AI citations are becoming as valuable as backlinks for visibility—if AI models cite your content frequently, it reaches millions of users. Structured data dramatically increases citation likelihood by making content easier for AI to parse and verify.

Entity

A distinct, well-defined "thing" in knowledge graphs—such as a person, place, company, product, or concept. Entities have unique identifiers and relationships to other entities. For example, "Apple Inc." is an entity connected to entities like "Tim Cook" (CEO), "iPhone" (product), and "Cupertino" (location). Search engines and AI use entity recognition to understand content meaning.

Imagine a web of facts where "Apple (company)" connects to "Steve Jobs" and "iPhone"—search engines and AI use these entity connections to understand relationships between things.

Entity Authority

The level of trust and recognition search engines and AI systems assign to specific entities (people, companies, brands). Entities with high authority—like established companies with Wikipedia entries, consistent structured data, and reputable mentions—get preferential treatment in search results and AI citations. Building entity authority requires consistent, structured presence across the web.

Fact Verification

Automated processes that check claims, statistics, and statements against reliable sources to ensure content accuracy. Fact verification systems cross-reference data points with authoritative databases, detect contradictions, and assign confidence scores. Essential for preventing the spread of misinformation and maintaining content credibility.

Fraud Detection

AI-powered systems that analyze patterns and signals to identify potentially fraudulent or spam content before publication. Fraud detection examines language patterns, domain reputation, company verification status, and behavioral signals. Protects platform integrity by preventing scams, fake companies, and misleading announcements.

LLM (Large Language Model)

AI models trained on vast amounts of text data that can understand and generate human-like text, such as ChatGPT (OpenAI), Claude (Anthropic), and Gemini (Google). LLMs power modern AI assistants, content generation tools, and increasingly, search engines themselves. They're trained on internet content, which is why structured, well-formatted press releases become "LLM brain food."

LLM Brain Food

Content optimized with structured data that AI language models can easily parse, index, and cite when responding to user queries. Just as humans prefer organized information over scattered notes, LLMs massively prefer content with JSON-LD schemas, semantic HTML, and clear metadata. This structured content becomes part of AI knowledge bases, increasing citation likelihood.

Think of it like the difference between handing someone a scattered pile of papers and a neatly organized binder with labeled sections: the binder is far easier for an AI brain to digest and remember.

Multi-Agent System

Architecture where multiple AI agents work together, each specialized in specific tasks, coordinating to accomplish complex objectives. For example, Pressonify uses separate agents for content generation, SEO optimization, fraud detection, fact-checking, and journalist matching. Multi-agent systems excel at problems too complex for a single AI model.

Prompt Engineering

The practice of designing effective instructions (prompts) for AI language models to produce desired outputs. Good prompts provide context, specify format, include examples, and set constraints. Prompt engineering is crucial for AI-generated content quality—vague prompts produce vague results, while precise prompts produce precise results.
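
For instance, a vague prompt versus one that applies the principles above (both invented for illustration):

```txt
# Vague
Write a press release about our new product.

# Precise
You are a PR writer. Write a 400-word press release announcing Acme's
24-hour plant delivery service launching in Ireland on 1 March.
Include: a headline under 80 characters, one customer quote, and a
boilerplate paragraph. Tone: factual, no superlatives. Output as markdown.
```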

RAG (Retrieval-Augmented Generation)

An AI technique where language models first retrieve relevant information from external sources (like databases or indexed web content) before generating responses. RAG reduces hallucinations by grounding AI outputs in real data. This is how AI search engines like Perplexity work—they retrieve your structured press releases, then generate answers while citing sources.
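
A minimal sketch of the retrieve-then-generate flow. The helper functions are stubs standing in for a real embedding model, vector database, and LLM API:

```typescript
// Stub helpers for illustration only; a real system would call an embedding
// model, a vector database, and an LLM provider here.
async function embedText(text: string): Promise<number[]> {
  return Array.from(text).map((ch) => ch.charCodeAt(0) / 255); // toy embedding
}
async function vectorSearch(embedding: number[], topK: number): Promise<string[]> {
  // Pretend these are the most relevant indexed documents (e.g. press releases).
  return ["Acme launched 24-hour plant delivery across Ireland on 1 March 2025."].slice(0, topK);
}
async function generateAnswer(prompt: string): Promise<string> {
  return `(LLM response grounded in the retrieved sources) ${prompt.slice(0, 60)}...`;
}

async function answerWithRag(question: string): Promise<string> {
  // 1. Retrieve: find the most relevant documents for the question.
  const queryEmbedding = await embedText(question);
  const documents = await vectorSearch(queryEmbedding, 3);

  // 2. Generate: ground the model's answer in the retrieved sources.
  const prompt =
    "Answer the question using only these sources, and cite them:\n\n" +
    documents.map((doc, i) => `[${i + 1}] ${doc}`).join("\n\n") +
    `\n\nQuestion: ${question}`;
  return generateAnswer(prompt);
}
```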

Training Data

The vast corpus of text, images, and information used to teach AI models during their initial training phase. Most LLMs train on internet content, including press releases, news articles, and structured data. Well-formatted, authoritative press releases become part of this training data, helping AI models learn about your company and industry. This is why press releases are now "AI training data," not just news.

📊 Structured Data & Schemas

JSON-LD Schema

JavaScript Object Notation for Linked Data—a method of encoding structured data using Schema.org vocabulary that search engines and AI models can easily parse. JSON-LD is Google's recommended format for implementing structured data, embedded in script tags in the page head. It tells search engines "this is a NewsArticle with headline X, published on date Y, by author Z."
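
A minimal example of JSON-LD embedded in a script tag (values illustrative):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Acme Plants",
  "url": "https://example.com",
  "logo": "https://example.com/logo.png"
}
</script>
```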

Knowledge Graph

A database of interconnected entities and relationships that search engines and AI systems use to understand and answer complex queries. Knowledge graphs power Google's information panels on the right side of search results and enable AI to answer questions like "Who is the CEO of [Company]?" by understanding entity relationships.

Imagine a web of facts where "Apple (company)" connects to "Steve Jobs" (founder), "iPhone" (product), and "Cupertino" (location)—search engines and AI use these connections to understand context and relationships.

NewsArticle Schema

Schema.org type specifically for news content that includes properties like headline, datePublished, author, and articleBody. NewsArticle schema enables rich results in Google News, Top Stories carousels, and AI-powered news aggregators. Press releases with proper NewsArticle markup get preferential treatment in news-focused search results.
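
The core NewsArticle properties mentioned above look like this in JSON-LD (values illustrative):

```json
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "Acme Launches AI-Powered Plant Delivery",
  "datePublished": "2025-01-15T09:00:00Z",
  "author": { "@type": "Organization", "name": "Acme Plants" },
  "articleBody": "Acme today announced a nationwide 24-hour plant delivery service..."
}
```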

Schema.org

A collaborative vocabulary created by Google, Microsoft, Yahoo, and Yandex for structured data markup on web pages. Schema.org provides hundreds of types (NewsArticle, Organization, Product, Person) and properties to describe content semantically. It's the universal language for telling search engines and AI what your content means.

Semantic HTML

HTML markup that conveys meaning about content structure using appropriate tags like <article>, <section>, <header>, and <nav> instead of generic <div> tags. Semantic HTML helps search engines and accessibility tools understand page structure and content relationships. It's part of making content "machine-readable."
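
For example, a press release page marked up semantically rather than with generic divs:

```html
<article>
  <header>
    <h1>Acme Launches AI-Powered Plant Delivery</h1>
    <time datetime="2025-01-15">15 January 2025</time>
  </header>
  <section>
    <p>Acme today announced a nationwide 24-hour delivery service...</p>
  </section>
  <footer>
    <p>About Acme: founded in 2020, Acme delivers plants across Ireland.</p>
  </footer>
</article>
```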

Structured Data

Machine-readable information organized in a standardized format using Schema.org vocabulary that helps search engines and AI understand content context. Structured data transforms unstructured HTML into organized, queryable information that powers rich results, knowledge panels, and AI citations. It's the difference between "some text" and "this is a company name: [value]."

⚙️ Technical Excellence

Analytics Ready

Web pages configured with tracking codes and event listeners to measure user behavior, traffic sources, and conversion metrics. Analytics integration enables data-driven optimization by showing what content works, where visitors come from, and which actions they take. Essential for measuring press release ROI.

API (Application Programming Interface)

A set of rules and protocols that allows different software applications to communicate with each other. APIs enable integrations—for example, a press release platform's API might let you submit releases programmatically from your CMS, or pull analytics data into your dashboard. APIs power modern software ecosystems.
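
As a sketch of the idea, submitting a release through such an API could look like this (the endpoint, token, and field names are hypothetical, not a documented Pressonify API):

```typescript
// Hypothetical endpoint and payload, shown only to illustrate programmatic submission.
async function submitPressRelease(): Promise<void> {
  const response = await fetch("https://api.example.com/v1/press-releases", {
    method: "POST",
    headers: {
      "Authorization": "Bearer YOUR_API_TOKEN", // placeholder credential
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      headline: "Acme Launches AI-Powered Plant Delivery",
      body: "Acme today announced...",
      publishAt: "2025-03-01T09:00:00Z",
    }),
  });
  console.log("Created release:", await response.json());
}
```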

Bounce Rate

The percentage of visitors who leave a website after viewing only one page without taking any action. High bounce rates often indicate poor content relevance, slow loading, confusing navigation, or mismatched user intent. For press releases, lower bounce rates suggest readers find the content valuable and explore further.

CDN (Content Delivery Network)

A distributed network of servers that caches and delivers website content from locations geographically closer to users, dramatically improving loading speeds. CDNs reduce server load, improve Core Web Vitals, and provide DDoS protection. Essential for global press release distribution with consistent fast loading worldwide.

Conversion

When a user completes a desired action on your website, such as signing up for a newsletter, downloading a whitepaper, requesting a demo, or making a purchase. Conversion rate is the percentage of visitors who convert. Press releases aim to drive conversions by attracting qualified traffic and compelling readers to act.

CTR (Click-Through Rate)

The percentage of people who click on a link after seeing it, calculated as clicks divided by impressions. High CTR indicates compelling headlines, meta descriptions, and preview text. For search results, CTR above 5% is good; above 10% is excellent. Featured snippets and rich results dramatically improve CTR.
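
For example, a listing that earns 500 clicks from 10,000 impressions has a CTR of 500 ÷ 10,000 = 5%.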

DNS (Domain Name System)

The internet's phone book that translates human-readable domain names (pressonify.ai) into IP addresses (192.0.2.1) that computers use to locate servers. DNS records also handle jobs like email routing (MX records) and proving domain ownership (TXT records). Domain verification for press releases often involves adding or checking a DNS TXT record.

HTTP / HTTPS

HyperText Transfer Protocol—the foundation of data communication on the web. HTTPS (HTTP Secure) adds SSL/TLS encryption for secure data transmission. Modern websites must use HTTPS for security, user trust, and SEO (Google penalizes non-HTTPS sites). The padlock icon in browsers indicates HTTPS connections.

Impressions

The number of times your content appears in search results, social feeds, or other platforms, regardless of whether users click. Impressions measure visibility reach—a press release with 10,000 impressions was seen 10,000 times. Compare impressions to clicks (CTR) to gauge headline and preview effectiveness.

Mobile Responsive

Web design that automatically adapts layout and content to display optimally across devices of all screen sizes, from smartphones to desktop computers. Mobile responsiveness is a critical ranking factor as over 60% of searches occur on mobile devices. Non-responsive sites provide poor user experience and rank lower.

SSL Secure

Websites using HTTPS protocol with SSL/TLS certificates to encrypt data transmission, indicated by a padlock icon in browsers. SSL security is mandatory for modern websites and affects both user trust and search rankings: Google uses HTTPS as a ranking signal, and browsers flag non-HTTPS sites as "Not secure."

🔒 Verification & Security

Domain Verified

Confirmation that a user controls a specific domain through email verification, DNS records, or other authentication methods to prevent spam and fraud. Domain verification ensures only legitimate business representatives can publish press releases on their company's behalf. This prevents competitors or bad actors from publishing false announcements under your company name.
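
For example, DNS-based verification typically asks you to add a TXT record like this (the record name and token are illustrative, not an actual verification string):

```txt
Type:  TXT
Host:  example.com
Value: "pressonify-verification=1a2b3c4d5e"
```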