Primetag is a fast-growing SaaS company in the influencer marketing analytics space. We help global brands and agencies make data-driven decisions about their influencer strategies, powered by our proprietary platform and market-leading database of over 3 billion content pieces across 50+ markets. Our clients rely on us to simplify, scale, and optimize their influencer marketing operations.
This is a senior, hands-on role for someone who views Data Infrastructure as a product. You will define how billions of records are structured, indexed, and exposed to the rest of the company.
This role is ideal for someone who combines strong backend engineering skills with deep expertise in data platforms, data lakes, and large-scale data systems.
We’re looking for a Data Platform Engineer to join the Core team at Primetag. The Core team owns our data foundations, backend systems, and shared infrastructure that power all products, analytics, and AI capabilities. You’ll work closely with Product, Backend, AI/R&D, and Analytics teams to turn research initiatives and experimental data work into production-ready, scalable systems.
You will not just manage databases; you will build the abstraction layer that powers our interfaces and AI models.
Database Reliability & Scaling
Own the health and performance of our core databases.
Own and optimize our MongoDB clusters and OpenSearch indexes, which house billions of documents. You will design sharding strategies and indexing patterns to keep search fast at scale.
Design and implement a multi-tier storage strategy. You will determine which data remains "hot" in production databases for our Product team and which data is offloaded to "cold/analytical" storage for AI and R&D.
Data Access Layer
Build and maintain internal APIs that let other teams build features without worrying about the underlying database complexity.
Schema Evolution & Migrations
Lead the strategy for updating data structures across billions of records. You will design "no-downtime" migration paths for our production MongoDB and OpenSearch environments.
Data Platform & Architecture
Own and evolve our Core Data Architecture, ensuring it supports analytics, product features, AI workflows, and internal consumption.
Evaluate the feasibility and ROI of introducing a Data Lakehouse architecture for long-term storage and AI training.
Define standards for how data is ingested, stored, versioned, and exposed to downstream systems.
Data Quality & Documentation
Ensure data accuracy, freshness, and consistency through validation and testing.
Maintain clear, up-to-date documentation of pipelines, schemas, and data assets to enable internal adoption.
Collaboration
Act as a bridge between Core, Product, and R&D, ensuring research initiatives (e.g. data enrichment) are integrated into the Core platform.
Translate business and product needs into scalable, maintainable data solutions.
Work comfortably in async-first environments, keeping task statuses and documentation up to date for full team visibility.
5+ years of experience in Backend, Data Engineering and/or Data Platform roles, specifically within high-concurrency/high-volume environments.
NoSQL & Search Expert. Professional experience tuning MongoDB and OpenSearch/Elasticsearch at massive scale (sharding, cluster topology, query optimization).
Strong Python Skills. Proficiency in writing production-grade, highly efficient Python. Experience with asynchronous programming (FastAPI, etc.) for building high-performance Internal APIs is essential.
System Design & Architecture. A strong grasp of distributed systems, including eventual consistency, message queues (Kafka/RabbitMQ), and caching strategies to protect production databases.
Strategic Storage Thinking. Familiarity with Data Lakehouse concepts (e.g., Parquet, Delta Lake, Iceberg), ideally with proven experience building one.
Excellent communication skills and attention to documentation.
Self-managed and comfortable working cross-functionally.
Fluent in English.
Tax residency in Portugal.
Experience with Azure Cloud services or equivalent.
Experience with orchestration tools like Airflow, Prefect, or similar.
Exposure to AI/ML workflows or supporting AI teams with data pipelines.
Knowledge of Data Observability tools to monitor data quality and indexing lag at scale.
Interest or experience in influencer marketing or social media analytics.
Help lead a paradigm shift in how B2B influencer marketing is executed and scaled.
Join a mission-driven, innovation-focused company in a fast-growing sector.
Work with a talented, ambitious, and collaborative international team.
Benefit from flexibility, autonomy, and the opportunity to shape the future of B2B marketing in our industry.
The timing is perfect: join us during an exciting expansion phase, with new products and markets on the way.
Grow and learn with us, supported by our veteran engineers and domain experts.
Take part in our annual “Nomad Offices” in amazing locations.
Be part of a collaborative and multicultural culture, driven by product excellence and real impact.
Submit your application using this form.