Instead of rendering offers in a separate UI widget, you can inject offer data into your LLM’s system prompt so the model naturally references products within its response. This guide walks through the pattern step by step. For traditional UI-rendered ads, see Rendering Offers. For the underlying API details, see REST API.
This guide assumes you have a ZeroClick API key. If not, start with the Quickstart.

1. Fetch Offers

Call POST /api/v2/offers before starting LLM generation. Pass a keyword-extracted version of the user’s message for best results.
type Offer = {
  id: string;
  title: string | null;
  subtitle: string | null;
  content: string | null;
  cta: string | null;
  clickUrl: string;
  brand: { name: string; logoUrl: string | null } | null;
  price: { amount: string; currency: string | null } | null;
};

async function fetchOffers(
  apiKey: string,
  query: string | null,
  ipAddress: string,
  userAgent?: string,
  limit: number = 3
): Promise<Offer[]> {
  const response = await fetch("https://zeroclick.dev/api/v2/offers", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "x-zc-api-key": apiKey,
    },
    body: JSON.stringify({
      method: "server",
      ipAddress,
      userAgent,
      query,
      limit,
    }),
  });

  if (!response.ok) return [];
  return response.json();
}
See REST API for authentication, optional parameters (userId, signals, etc.), and the full request/response schema.

2. Add Offers to the System Prompt

Serialize the offers and include them in your system prompt. LLMs parse JSON without issue — JSON.stringify is the simplest approach:
const systemPrompt = `
  ...your existing instructions...

  <available_offers>
    ${JSON.stringify(offers)}
  </available_offers>
`;
If you prefer more readable prompts (helpful for debugging and prompt iteration), you can format offers into structured text:
function formatOffersForPrompt(offers: Offer[]): string {
  if (offers.length === 0) return "No sponsored offers available.";

  return offers
    .map((offer, i) => {
      const lines = [`Offer ${i + 1}:`];
      if (offer.title) lines.push(`  Title: ${offer.title}`);
      if (offer.brand?.name) lines.push(`  Brand: ${offer.brand.name}`);
      if (offer.price?.amount) {
        const price = [offer.price.amount, offer.price.currency]
          .filter(Boolean)
          .join(" ");
        lines.push(`  Price: ${price}`);
      }
      if (offer.cta) lines.push(`  CTA: ${offer.cta}`);
      lines.push(`  Link: ${offer.clickUrl}`);
      return lines.join("\n");
    })
    .join("\n\n");
}
Then use formatOffersForPrompt(offers) instead of JSON.stringify(offers) in the system prompt.
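For example, the same system prompt as above with the formatter swapped in:
const systemPrompt = `
  ...your existing instructions...

  <available_offers>
    ${formatOffersForPrompt(offers)}
  </available_offers>
`;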
Each offer adds roughly 200–400 tokens to your prompt. Keep limit low (2–4) to avoid bloating the context window.

3. Write Weaving Instructions

How you instruct the LLM determines how offers appear in the response. Below are example patterns to illustrate common approaches — adapt them to your product’s voice and combine them with your existing system prompt instructions.
These are starting points, not copy-paste templates. The right instructions depend on your product, audience, and tone. Experiment and iterate.

Blended Recommendations

The LLM mixes its own expert picks with sponsored offers in a unified list.
Start with brief, helpful advice relevant to the user's question. Create a
unified list of 3-5 recommendations mixing your own expert picks and the
available offers below. Do not duplicate — if an offer matches something you'd
recommend, use that offer's link.

List format:
- For items with a link from the offers: [Brand + Product](link) - reason
- For your own picks (no link available): Brand + Product - reason
- Use exact links from the available offers. Never fabricate URLs.
- Intersperse your picks with sponsored offers naturally.
Use when: You want organic + sponsored results in one cohesive answer.

Offers Only

The LLM draws exclusively from available offers.
Create a list of the most relevant items from the available offers below.

List format:
- [Product Name](link) - one-line reason it's relevant
- Shorten product names to brand + model only.
- Use exact links from the offers. Never fabricate URLs.
- Use "from" or "starting at" for prices, since data may be stale.
Use when: You have high offer volume and want full ad coverage.

Always Include

The LLM must include at least some offers, even if only tangentially relevant.
You MUST include at least some of the available offers in your response. If
offers aren't directly relevant, find a creative connection — frame them as a
complementary suggestion or a "while you're at it" pick.

Provide your answer to the user's question first, then include a selection
of offers.
Use when: Maximizing impressions per response.

Separated Section

Offers appear in a distinct section after the LLM’s organic content.
Structure the response in two sections:

1. MAIN RESPONSE: Answer the user's question with your own knowledge.

2. SPONSORED RECOMMENDATIONS (after a horizontal rule): List up to 3 relevant
   offers from the available offers below. These should complement, not
   duplicate, your main recommendations.
Use when: Editorial pages, content surfaces, or compliance-sensitive contexts where clear separation between organic and sponsored content is important.

4. Wire It Together

Here’s the complete flow combining the steps above:
async function handleChatMessage(req, llm) {
  const userMessage = getLastMessage(req.messages);

  // 1. Fetch offers before starting the LLM stream
  const offers = await fetchOffers(
    req.apiKey,
    extractKeywords(userMessage), // "best running shoes" → "running shoes"
    req.ip,
    req.headers["user-agent"]
  );

  // 2. Build system prompt with offers and weaving instructions
  const systemPrompt = `
    ...your core instructions...

    ...your weaving instructions (see patterns above)...

    <available_offers>
      ${JSON.stringify(offers)}
    </available_offers>
  `;

  // 3. Call the LLM — no special tooling needed
  const response = await llm.chat({
    system: systemPrompt,
    messages: req.messages,
  });

  // 4. Return both the response and raw offers (needed for impression tracking)
  return { response, offers };
}
Keyword extraction improves offer relevance. Consider a lightweight LLM step that extracts shopping-intent keywords from the user’s message (e.g., “I need new running shoes for my marathon” → "running shoes marathon"). See Improving Offer Relevance for more detail.
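A minimal sketch of such a step, assuming the same llm.chat-style client used in step 4 (the prompt wording and the { text } response shape are assumptions); because it calls an LLM, the call in step 4 becomes await extractKeywords(userMessage):
// Hypothetical keyword-extraction step matching the extractKeywords call in step 4.
// Assumes an `llm` client in scope with the chat() interface used above and a
// response shape of { text: string }; adapt both to your actual client.
async function extractKeywords(userMessage: string): Promise<string | null> {
  const result = await llm.chat({
    system:
      "Extract 2-5 shopping-intent keywords from the user's message. " +
      "Reply with only the keywords, space-separated, or NONE if there is no shopping intent.",
    messages: [{ role: "user", content: userMessage }],
  });

  const keywords = result.text?.trim();
  // fetchOffers accepts query: string | null, so return null when there is no shopping intent
  return !keywords || keywords === "NONE" ? null : keywords;
}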

5. Track Impressions

Because offers are woven into text (not rendered as discrete UI cards), impression tracking requires matching clickUrl values in the rendered response against the offers you provided. Your server should send the offers array to the client alongside the LLM response. Then, once the response finishes rendering, the client checks which offers were actually referenced:
// Client-side: after the LLM response finishes rendering
function getReferencedOfferIds(responseText, offers) {
  return offers
    .filter((offer) => responseText.includes(offer.clickUrl))
    .map((offer) => offer.id);
}

// Track only the offers the user actually saw
const ids = getReferencedOfferIds(renderedText, offers);

if (ids.length > 0) {
  await fetch("https://zeroclick.dev/api/v2/impressions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ ids }),
  });
}
The LLM may not use all offers you provide. If you fetch 4 offers but the model only references 2, only those 2 should count as impressions.
Impression requests must originate from the end user’s device, not your server. Requests will be rate limited per IP.
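To connect the two halves, here is a minimal client-side sketch. It assumes the handleChatMessage handler from step 4 is exposed at a hypothetical /chat route that returns the model's text as response, and that renderResponse is your existing rendering step:
// Client-side sketch; /chat and renderResponse are placeholders for your own stack
async function sendChatMessage(messages: { role: string; content: string }[]) {
  const res = await fetch("/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ messages }),
  });
  const { response: text, offers } = await res.json();

  renderResponse(text); // render the response before tracking impressions

  // Count impressions only for offers the model actually referenced (see above)
  const ids = getReferencedOfferIds(text, offers);
  if (ids.length > 0) {
    await fetch("https://zeroclick.dev/api/v2/impressions", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ ids }),
    });
  }
}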

Next Steps

- REST API Reference: Full parameter reference and examples
- Rendering Offers: Display offers as UI components instead
- Signal Collection: Improve offer relevance with user context