Consuming Product Feeds
Integrate with Drapier product feeds in XML, CSV, or JSON format to keep your catalog up to date.
Product feeds give you access to the full catalog of products available on Drapier. Feeds are available in XML (Google Shopping format), CSV, and JSON. Each publisher gets a unique feed URL with pre-built deep links — every product link in the feed is already a trackable affiliate link tied to your publisher account.
For feed field definitions and format details, see Product Feeds — Concepts. For API endpoint details, see Feeds API Reference.
Feed URL format
```
https://api.drapier.io/api/v1/feeds/{publisherId}/products.{format}?key={apiKey}
```

| Parameter | Description |
|---|---|
| publisherId | Your publisher ID (e.g. pub_123) |
| format | xml, csv, or json |
| key | Your API key (required) |
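As a sketch of how these pieces fit together, the following assembles a feed URL in Python using the placeholder publisher ID and API key from this page (substitute your own values). The filter parameters shown ride along as ordinary query parameters and are described in the next section.

```python
from urllib.parse import urlencode

# Placeholder credentials — substitute your own publisher ID and API key.
PUBLISHER_ID = "pub_123"
API_KEY = "sk_live_abc123"
FMT = "json"  # "xml", "csv", or "json"

base = f"https://api.drapier.io/api/v1/feeds/{PUBLISHER_ID}/products.{FMT}"
query = urlencode({"key": API_KEY, "brand": "Gucci", "maxPrice": 500})
feed_url = f"{base}?{query}"
print(feed_url)
# → https://api.drapier.io/api/v1/feeds/pub_123/products.json?key=sk_live_abc123&brand=Gucci&maxPrice=500
```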
Filtering feeds
Append query parameters to narrow the feed to specific products:
| Parameter | Type | Description |
|---|---|---|
| brand | string | Filter by brand name (e.g. Gucci) |
| category | string | Filter by product category |
| minPrice | number | Minimum sale price in USD |
| maxPrice | number | Maximum sale price in USD |
| onlyInStock | boolean | Exclude out-of-stock items (default true) |
Example — Gucci products under $500:
```
https://api.drapier.io/api/v1/feeds/pub_123/products.json?key=sk_live_abc123&brand=Gucci&maxPrice=500
```

XML feed integration
The XML feed follows the Google Shopping (Merchant Center) specification, making it compatible with most affiliate tools and comparison shopping engines.
Python
```python
import requests
import xml.etree.ElementTree as ET

FEED_URL = "https://api.drapier.io/api/v1/feeds/pub_123/products.xml"
API_KEY = "sk_live_abc123"

resp = requests.get(FEED_URL, params={"key": API_KEY})
resp.raise_for_status()

root = ET.fromstring(resp.content)
ns = {"g": "http://base.google.com/ns/1.0"}

for item in root.findall(".//item"):
    title = item.find("title").text
    price = item.find("g:price", ns).text
    link = item.find("link").text
    brand = item.find("g:brand", ns).text
    print(f"{brand} — {title}: {price}")
    print(f"  Link: {link}")
```

JavaScript (Node.js)
```javascript
import { parseStringPromise } from "xml2js";

const FEED_URL =
  "https://api.drapier.io/api/v1/feeds/pub_123/products.xml";
const API_KEY = "sk_live_abc123";

const resp = await fetch(`${FEED_URL}?key=${API_KEY}`);
const xml = await resp.text();
const result = await parseStringPromise(xml);

const items = result.rss.channel[0].item;
for (const item of items) {
  console.log(`${item.title[0]}: ${item["g:price"][0]}`);
  console.log(`  Link: ${item.link[0]}`);
}
```

CSV feed integration
The CSV feed is a flat file with one row per product variant. It uses comma delimiters and UTF-8 encoding.
```python
import csv
import io
import requests

FEED_URL = "https://api.drapier.io/api/v1/feeds/pub_123/products.csv"
API_KEY = "sk_live_abc123"

resp = requests.get(FEED_URL, params={"key": API_KEY})
resp.raise_for_status()

reader = csv.DictReader(io.StringIO(resp.text))
for row in reader:
    print(f"{row['brand']} — {row['title']}: {row['price']} {row['currency']}")
    print(f"  Link: {row['link']}")
```

JSON feed integration
The JSON feed returns a structured array, ideal for building custom UIs or importing into a database.
```javascript
const FEED_URL =
  "https://api.drapier.io/api/v1/feeds/pub_123/products.json";
const API_KEY = "sk_live_abc123";

const resp = await fetch(`${FEED_URL}?key=${API_KEY}`);
const { products } = await resp.json();

for (const product of products) {
  console.log(`${product.brand} — ${product.title}`);
  console.log(`  Price: $${product.salePrice}`);
  console.log(`  Link: ${product.link}`);
  console.log(`  Image: ${product.imageUrl}`);
}
```

JSON product object fields:
| Field | Type | Description |
|---|---|---|
| id | string | Product ID |
| title | string | Product title |
| brand | string | Brand name |
| category | string | Product category |
| salePrice | number | Current sale price (USD) |
| currency | string | Always USD |
| link | string | Pre-built deep link with your publisher ID |
| imageUrl | string | Primary product image URL |
| description | string | Product description |
| availability | string | in_stock or out_of_stock |
| condition | string | new or pre_owned |
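If you want type checking when handling these objects, the field table above maps naturally onto a `TypedDict`. This is a sketch, not a type shipped by Drapier — the class and helper names are illustrative:

```python
from typing import TypedDict

class FeedProduct(TypedDict):
    """Shape of one product object in the JSON feed, per the field table above."""
    id: str
    title: str
    brand: str
    category: str
    salePrice: float
    currency: str        # always "USD"
    link: str            # pre-built deep link with your publisher ID
    imageUrl: str
    description: str
    availability: str    # "in_stock" or "out_of_stock"
    condition: str       # "new" or "pre_owned"

def in_stock(products: list[FeedProduct]) -> list[FeedProduct]:
    # Keep only products currently available.
    return [p for p in products if p["availability"] == "in_stock"]
```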
Caching strategy
Feeds regenerate every 4 hours as products are added, repriced, or marked out of stock.
Use ETag headers for efficient polling:
```python
import requests

FEED_URL = "https://api.drapier.io/api/v1/feeds/pub_123/products.json"
API_KEY = "sk_live_abc123"

etag = None

def fetch_feed():
    global etag
    headers = {}
    if etag:
        headers["If-None-Match"] = etag
    resp = requests.get(FEED_URL, params={"key": API_KEY}, headers=headers)
    if resp.status_code == 304:
        print("Feed unchanged, using cached version")
        return None
    etag = resp.headers.get("ETag")
    return resp.json()
```

Recommended refresh schedule: every 4 hours during business hours, once overnight.
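One way to turn that schedule into code is to compute the next refresh time and sleep until it. This is a sketch under assumptions not set by Drapier: the slot hours below (an 02:00 overnight run, then every 4 hours from 08:00 to 20:00) are an example window you should adapt to your own traffic.

```python
from datetime import datetime, timedelta

# Example slots: one overnight run at 02:00, then every 4 h during business hours.
REFRESH_HOURS = [2, 8, 12, 16, 20]

def next_refresh(now: datetime) -> datetime:
    """Return the next scheduled refresh time strictly after `now`."""
    for hour in REFRESH_HOURS:
        slot = now.replace(hour=hour, minute=0, second=0, microsecond=0)
        if slot > now:
            return slot
    # All of today's slots have passed: first slot tomorrow (the overnight run).
    tomorrow = now + timedelta(days=1)
    return tomorrow.replace(hour=REFRESH_HOURS[0], minute=0, second=0, microsecond=0)
```

A worker can then sleep until `next_refresh(datetime.now())` and call the ETag-aware fetch shown above on each wake-up.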
Feed schema reference
For the full XML schema specification and field definitions, see Product Feeds — Concepts.