If your agent fetches a web page, scrapes data, or previews a
website it just generated, the sandbox is where that happens. Each
browser session runs in its own microVM — no cross-tenant cookie
leakage, no shared Chromium profile, no “oops, the last agent left
a modal open.”
Headless browser — Playwright
```python
from podflare import Sandbox

sbx = Sandbox(template="python-datasci", idle_timeout_seconds=900)

# Install Playwright + Chromium
sbx.run_code("pip install -q playwright && playwright install chromium")

r = sbx.run_code("""
from playwright.sync_api import sync_playwright
import base64

with sync_playwright() as p:
    br = p.chromium.launch()
    page = br.new_page()
    page.goto('https://news.ycombinator.com')
    titles = page.locator('.titleline > a').all_inner_texts()[:5]
    print('\\n'.join(titles))
    img = page.screenshot()
    print('SCREENSHOT_B64:' + base64.b64encode(img).decode())
    br.close()
""")

text, _, png_b64 = r.stdout.partition("SCREENSHOT_B64:")
print("Headlines:\n", text)
print("Screenshot base64 length:", len(png_b64))

sbx.close()
```
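The screenshot arrives as base64 text on stdout, so the host side needs to decode it before serving or attaching it. A minimal sketch, assuming `png_b64` holds the base64 string extracted above (the helper name is ours):

```python
import base64
from pathlib import Path

def save_screenshot(png_b64: str, dest: str = "screenshot.png") -> int:
    """Decode a base64-encoded PNG captured in the sandbox and write
    it to disk; returns the number of decoded bytes."""
    png_bytes = base64.b64decode(png_b64.strip())
    Path(dest).write_bytes(png_bytes)
    return len(png_bytes)
```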
Real-world uses
- Research agents that gather links + screenshots for a report.
- Accessibility audits — launch Chromium, inject axe-core, return violations.
- Form-filling bots for routine admin work on sites without an
API (expense reports, government portals, etc.).
- “Visual diff” on generated UIs — screenshot before and after
a change, pipe both to a vision model.
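For the visual-diff case, the two screenshots usually travel to the model as base64 data URLs. A sketch of assembling such a request body, assuming an OpenAI-style chat payload; the exact message shape varies by provider, and `gpt-4o` is just a placeholder model name:

```python
import base64
import json

def vision_diff_payload(before_png: bytes, after_png: bytes) -> str:
    """Build an OpenAI-style chat request asking a vision model to
    compare two UI screenshots. The shape is illustrative only."""
    def data_url(png: bytes) -> str:
        return "data:image/png;base64," + base64.b64encode(png).decode()

    return json.dumps({
        "model": "gpt-4o",  # placeholder model name
        "messages": [{
            "role": "user",
            "content": [
                {"type": "text",
                 "text": "Describe the visual differences between these two screenshots."},
                {"type": "image_url", "image_url": {"url": data_url(before_png)}},
                {"type": "image_url", "image_url": {"url": data_url(after_png)}},
            ],
        }],
    })
```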
First-time playwright install downloads ~120 MB. Do it in the
first agent turn and keep the sandbox alive for the session so
subsequent turns skip the download.
Boot a Next.js dev server, screenshot the result
This is the pattern for v0.dev-style “AI-generated websites.” The
agent writes React files, the sandbox runs next dev, Playwright
takes a screenshot so the user can see what the agent built.
```python
from podflare import Sandbox

sbx = Sandbox(idle_timeout_seconds=1800)

# 1. Scaffold + install
sbx.run_code("""
mkdir -p /tmp/site && cd /tmp/site

cat > package.json <<'PKG'
{"dependencies":{"next":"^15","react":"^19","react-dom":"^19"}}
PKG

mkdir -p app
cat > app/page.tsx <<'TSX'
export default function P() {
  return <h1>hi from sandbox</h1>;
}
TSX

cat > app/layout.tsx <<'TSX'
export default function L({children}:{children:React.ReactNode}) {
  return <html><body>{children}</body></html>;
}
TSX

npm install --silent
""", language="bash")
```
```python
# 2. Start dev server in background
sbx.run_code(
    "cd /tmp/site && nohup npx next dev -p 3000 > /tmp/next.log 2>&1 &",
    language="bash",
)
```
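The fixed `time.sleep(3)` in step 3 works, but a readiness poll is sturdier when `npm install` or the first compile runs long. A minimal sketch, assuming the dev server's URL; any HTTP response, even an error status, means the port is up:

```python
import time
import urllib.error
import urllib.request

def wait_for_server(url: str, timeout: float = 30.0, interval: float = 0.5) -> bool:
    """Poll `url` until the server answers or `timeout` elapses.
    Any HTTP response (including an error status) counts as ready."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            urllib.request.urlopen(url, timeout=2)
            return True
        except urllib.error.HTTPError:
            return True          # server answered with an error status: it's up
        except (urllib.error.URLError, OSError):
            time.sleep(interval)  # not listening yet, retry
    return False
```

Run this in the sandboxed Python snippet in place of the sleep, then proceed to the screenshot once it returns `True`.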
```python
# 3. Screenshot
sbx.run_code("pip install -q playwright && playwright install chromium")

r = sbx.run_code("""
from playwright.sync_api import sync_playwright
import base64, time

time.sleep(3)  # give next a beat
with sync_playwright() as p:
    br = p.chromium.launch()
    page = br.new_page()
    page.goto('http://localhost:3000')
    img = page.screenshot()
    print('B64:' + base64.b64encode(img).decode()[:120] + '…')
    br.close()
""")

print(r.stdout)
sbx.close()
```
Why isolation matters
- Cookies + localStorage. Fresh per sandbox. Nothing to leak between customers.
- Downloaded files. /tmp is the sandbox's, not yours. When the sandbox dies, so do the downloads.
- Long-running headless Chromium. Chromium likes to eat RAM and sometimes crash. The host is never affected; your box dies, your box respawns, the user doesn't see it.
Pitfalls
- Headless Chromium is memory-hungry. 1 GB free-tier RAM is tight — expect OOMs on heavy pages. Pro (4 GB) is the happy place.
- page.wait_for_load_state('networkidle') is more reliable than time.sleep(3). Use it for agent-generated pages where you don't know how long hydration takes.
- Don't pip install playwright per turn. Install once at sandbox start; subsequent turns reuse the venv.
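The last pitfall can be enforced mechanically. A sketch of a per-process install guard, assuming any `run_code`-style callable like the one the Sandbox exposes (the helper and the cache are ours, not part of the SDK):

```python
_installed = set()  # install specs already handled in this process

def ensure_installed(run_code, spec: str) -> None:
    """Run a pip install at most once per process.
    `run_code` is any callable that executes a shell command
    (e.g. a sandbox's run_code); `spec` names the package."""
    if spec not in _installed:
        run_code(f"pip install -q {spec}")
        _installed.add(spec)
```

Call `ensure_installed(sbx.run_code, "playwright")` on every turn; only the first call actually hits pip.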