Full-time
Remote
Job Description
GNO Partners | Remote (Global) | Mid-Level (3–5 years)
About GNO Partners
GNO Partners helps Amazon sellers run smarter, more profitable businesses. We're building a platform that aggregates Amazon report data into actionable insights and pushes optimizations back to Amazon at scale. Our next major milestone is replacing manual report uploads with direct integrations to Amazon's SP-API and Ads API — and that's where this role comes in.
What You'll Do
This role is full-stack, but with a clear lean toward data engineering. You'll:
* Design and build the data pipelines that pull from Amazon SP-API and Ads API into our Postgres warehouse — handling rate limits, retries, schema evolution, and the messy realities of vendor APIs.
* Architect ingestion, transformation, and aggregation layers that power our reporting tools.
* Build new report tools end-to-end alongside the rest of the engineering team — you won't work only on pipelines.
* Help us scale our data layer as we move from per-client manual uploads to automated, multi-tenant data flows.
* Contribute to our AI insights layer — feeding clean, structured data into LLM-powered analysis.
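To give a flavor of the first bullet: handling a flaky vendor API usually starts with a retry wrapper like the minimal sketch below. The function name, retry budget, and delays are illustrative assumptions, not our actual SP-API client.

```typescript
// Illustrative sketch (not our production client): retrying a flaky
// vendor-API call with exponential backoff plus jitter, the kind of
// handling SP-API / Ads API ingestion needs. All names and limits
// here are hypothetical.
async function withRetries<T>(
  call: () => Promise<T>,
  maxAttempts = 5,
  baseDelayMs = 500,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await call();
    } catch (err) {
      lastError = err;
      if (attempt < maxAttempts - 1) {
        // Exponential backoff with jitter before the next attempt.
        const delayMs = baseDelayMs * 2 ** attempt * (0.5 + Math.random() / 2);
        await new Promise((resolve) => setTimeout(resolve, delayMs));
      }
    }
  }
  // All attempts failed: surface the last error to the caller.
  throw lastError;
}
```

In practice this sits behind a token-bucket rate limiter as well, since SP-API throttles per operation.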
What We're Looking For
* 3–5 years of full\-stack experience, with a demonstrable track record of building data pipelines in production. We want to see real examples — pipelines you designed, problems you solved, scale you handled.
* Strong with TypeScript, Node.js, NestJS, and React.
* Deep comfort with Postgres / Supabase — partitioning, indexing, query optimization, handling large datasets.
* Hands-on experience with AWS: S3, Lambda, SNS, SQS, EC2. Bonus for orchestration tools (Step Functions, EventBridge, Airflow, Temporal, etc.).
* Experience integrating with third-party APIs at scale — pagination, rate limiting, backfills, incremental sync.
* Basic familiarity with AI agentic systems — you've worked with or explored LLMs, tool-use, or agent frameworks.
* Pragmatic and product-aware — you understand pipelines exist to serve user-facing features, not for their own sake.
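As a concrete reference point for the pagination bullet: the core pattern behind backfills is draining a cursor-paginated endpoint until the token runs out. This is a generic sketch under assumed names (`Page`, `fetchPage`, `nextToken`), not any specific Amazon API shape.

```typescript
// Illustrative sketch: draining a cursor-paginated vendor API, the
// pattern behind backfills and incremental sync. The Page shape and
// fetchPage callback are hypothetical.
interface Page<T> {
  items: T[];
  nextToken: string | null; // null means no more pages
}

async function fetchAll<T>(
  fetchPage: (token: string | null) => Promise<Page<T>>,
): Promise<T[]> {
  const items: T[] = [];
  let token: string | null = null;
  do {
    // Request the next page; a null token asks for the first page.
    const page = await fetchPage(token);
    items.push(...page.items);
    token = page.nextToken;
  } while (token !== null);
  return items;
}
```

Incremental sync is the same loop started from a persisted cursor instead of `null`.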
Nice to Have
* Direct experience with Amazon SP-API and/or Ads API.
* Background with ETL frameworks, streaming systems (Kafka, Kinesis), or workflow engines (Temporal, Airflow).
* Experience with data quality tooling, observability, or pipeline monitoring.
Why Join
* You'll own the data backbone of a platform that's actively scaling.
* High-leverage work — every pipeline you build directly enables new product surface area.
* Clear path to work across pipelines, product, and AI as the platform evolves.