Project Overview
We are building a Next.js + Node.js + Vercel analytics SaaS for affiliates. Users connect affiliate networks via API or CSV upload, we normalise performance stats, and we provide dashboards, GEO heatmaps, anonymised and aggregated network benchmarks, and alerts.
The core challenge (and priority) is building a clean, reliable data ingestion and aggregation system that scales as more users and networks connect.
What You’ll Work On
You will lead the backend/data engineering work for ingestion and analytics, including:
- Design the data model in Postgres for raw imports, normalised metrics, and aggregated reporting
- Build API integrations (auth where needed, scheduled pulls, pagination, rate limits)
- Build a CSV import pipeline (validation, mapping, deduplication, error handling, audit trail)
- Implement background jobs for ingestion and processing (cron plus queues/workers) with retries, backoff, and idempotency (see the sketch after this list)
- Build the aggregation layer for dashboards/heatmaps and anonymised benchmarks (privacy-safe rules for small sample sizes)
- Expose clean API endpoints for the Next.js app (performance focused)
- Add logging/monitoring and basic automated tests for pipeline reliability
- Ensure sensible handling of sensitive data (secrets, access tokens, and user isolation)
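For context on the retry/idempotency expectations, here is a minimal sketch. The table name `raw_events`, the unique constraint on `(source, source_row_hash)`, and the `pg` client setup are assumptions for illustration, not an existing schema:

```typescript
import { Pool } from "pg";
import { createHash } from "node:crypto";

const pool = new Pool({ connectionString: process.env.DATABASE_URL });

// Hypothetical raw-event shape; the real schema comes out of the data-model work.
interface RawEvent {
  source: string;     // e.g. "csv" or a network identifier
  externalId: string; // row/transaction id as reported by the source
  payload: unknown;   // untouched source record, kept for the audit trail
}

// Deterministic content hash so a re-imported row dedupes instead of duplicating.
function rowHash(e: RawEvent): string {
  return createHash("sha256")
    .update(`${e.source}:${e.externalId}:${JSON.stringify(e.payload)}`)
    .digest("hex");
}

// Idempotent insert: assumes a unique index on (source, source_row_hash),
// so replaying the same batch is a no-op rather than a duplicate import.
async function ingest(events: RawEvent[]): Promise<void> {
  for (const e of events) {
    await pool.query(
      `INSERT INTO raw_events (source, external_id, source_row_hash, payload)
       VALUES ($1, $2, $3, $4)
       ON CONFLICT (source, source_row_hash) DO NOTHING`,
      [e.source, e.externalId, rowHash(e), JSON.stringify(e.payload)]
    );
  }
}

// Exponential backoff around a flaky source pull; a production worker would
// persist attempt counts and respect per-network rate limits.
async function withRetries<T>(fn: () => Promise<T>, maxAttempts = 5): Promise<T> {
  for (let attempt = 1; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt >= maxAttempts) throw err;
      await new Promise((r) => setTimeout(r, 2 ** attempt * 1_000));
    }
  }
}
```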
Current Stack / Preferences
- Next.js (App Router) + Node.js runtime
- Vercel deployment
- Postgres (hosted provider is flexible)
- ORM: Prisma or Drizzle (open to your recommendation)
- Background jobs: cron + queue/worker approach (open to your recommendation; one possible shape is sketched below)
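As a sketch only, the cron entry point on Vercel could look like the following. The route path, the `vercel.json` schedule, and the `runIngestionBatch` stub are placeholders; Vercel Cron can be configured to send an `Authorization: Bearer <CRON_SECRET>` header, which the handler checks:

```typescript
// app/api/cron/ingest/route.ts: an App Router route handler invoked by a
// Vercel Cron schedule (defined in vercel.json in this assumed setup).
import { NextResponse } from "next/server";

export async function GET(request: Request) {
  // Vercel Cron can send `Authorization: Bearer <CRON_SECRET>`; reject other callers.
  const auth = request.headers.get("authorization");
  if (auth !== `Bearer ${process.env.CRON_SECRET}`) {
    return NextResponse.json({ error: "unauthorized" }, { status: 401 });
  }

  // Run one bounded ingestion batch, then return. In practice this step would
  // hand off to a queue/worker (the choice is open per this brief) so the
  // handler stays within serverless execution limits.
  const processed = await runIngestionBatch();
  return NextResponse.json({ processed });
}

// Hypothetical stub so the sketch is self-contained.
async function runIngestionBatch(): Promise<number> {
  return 0;
}
```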
Required Experience (Must Have)
- Strong Node.js/TypeScript backend experience (production SaaS preferred)
- Deep Postgres skills (schema design, indexing, query optimisation, migrations)
- Real background-job experience (queues/workers, cron scheduling, retries, idempotency)
- Proven experience integrating third-party APIs and handling messy/partial data
- Ability to design systems that are clean, organised, and maintainable
Nice to Have
- Experience deploying Node/Next.js systems on Vercel or serverless environments
- Experience building analytics/aggregation systems (materialized views, rollups, caching strategies)
- Familiarity with privacy-safe aggregation (minimum sample thresholds, anonymisation rules)
- Experience with affiliate platforms, iGaming, or performance marketing analytics
- Observability tooling (Sentry, OpenTelemetry, structured logging)
Engagement
- Contract role (remote)
- Start with an initial scope focused on the ingestion + aggregation MVP, with potential for ongoing work
- Please confirm you are comfortable with the milestone-based budget and timeline below
- Deliverables are defined by the milestone acceptance criteria below
What Success Looks Like (Deliverables)
- Clear backend architecture for ingestion, processing, and aggregation
- Working pipeline for CSV import and at least one API integration (with a pattern to add more)
- Normalised metric layer (consistent definitions across sources; an illustrative shape is sketched below)
- Aggregated tables/endpoints powering dashboards + GEO heatmap
- Foundation for anonymised benchmark calculations
- Clean code structure, basic tests, and logging
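To make "consistent definitions across sources" concrete, one illustrative shape for a normalised row follows. Field names, units, and the EUR reporting currency are assumptions, not a settled schema:

```typescript
// One row of the hypothetical normalised layer: every source (API or CSV) is
// mapped into this shape before aggregation, so dashboards never see source quirks.
interface NormalizedMetric {
  sourceId: string; // which network/CSV batch the row came from
  date: string;     // ISO date, normalised to UTC day buckets
  geo: string;      // ISO 3166-1 alpha-2 country code for the heatmap
  clicks: number;
  conversions: number;
  revenue: number;  // converted into a single reporting currency
  currency: "EUR";  // assumption: EUR as the reporting currency
}

// Each integration supplies one mapper from its raw rows into the shared shape.
type SourceMapper = (raw: Record<string, unknown>) => NormalizedMetric;
```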
How to Apply
Please send:
- A short intro and 1–3 relevant projects you’ve shipped (links if possible)
- Your preferred stack for Postgres + jobs (Prisma/Drizzle, cron/queues, ETL approach)
- A brief outline of how you would design ingestion + deduplication + retries for API and CSV sources
Screening Questions (Answer briefly)
- Describe a pipeline you built. How did you handle retries, rate limits, and duplicate imports?
- What’s your preferred approach for background jobs in a Next.js/Vercel setup?
- How would you prevent anonymised benchmarks from leaking data in small GEO/brand sample sizes? (A minimal thresholding sketch follows for context.)
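As a reference point for the last question, here is a minimal k-anonymity-style guard: benchmark buckets below a minimum distinct-account count are suppressed in the query itself. The table/column names (`normalized_metrics`, `account_id`, `conversion_rate`) and the threshold value are assumptions:

```typescript
// Buckets with fewer than MIN_SAMPLE distinct contributing accounts are
// filtered out in SQL, so a single affiliate's numbers cannot be read out
// of a small GEO/brand cell.
import { Pool } from "pg";

const pool = new Pool({ connectionString: process.env.DATABASE_URL });
const MIN_SAMPLE = 5; // assumed threshold; the real value is a product decision

async function benchmarkByGeo() {
  const { rows } = await pool.query(
    `SELECT geo,
            COUNT(DISTINCT account_id) AS sample_size,
            AVG(conversion_rate)       AS avg_cr
       FROM normalized_metrics
      GROUP BY geo
     HAVING COUNT(DISTINCT account_id) >= $1`,
    [MIN_SAMPLE]
  );
  return rows; // suppressed buckets never leave the database
}
```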
We are optimising for correctness and reliability over flashy UI. The data pipeline is the constraint. Please include one example of a data pipeline you shipped in production and what broke first.
---------------------------------
**See attached PDF for Milestones and detailed project overview**
Budget
- Timeline: preferably within 3 months (Milestones 1 to 5 delivered on a rolling basis)
- Payment: milestone-based, €1,200 per milestone (5 milestones)
- Total budget: €6,000
Milestone payments are released as milestones are completed and accepted rather than on a fixed monthly schedule; more than one milestone may be delivered in the same month depending on progress.
---------------------------------
Preferred applicants: Senior backend/data engineers with proven production experience in Node.js/TypeScript, Postgres, and background job systems (data pipelines, ETL, ingestion, rollups).
More ongoing work available after this project for the right candidate.