Our companies are hiring.

Work for one of Pelion's portfolio companies.

Senior Data Engineer

UpGuard


Data Science
Sydney, NSW, Australia
Posted on Nov 5, 2025
Who are we?
UpGuard’s mission is to make life easier for security teams. We meticulously create robust solutions that enable our customers to identify, assess, and remediate cybersecurity risk across their attack surface, vendor ecosystem, workforce, and trust relationships. Our integrated cyber risk posture management platform combines comprehensive security ratings, instant risk assessments, templated security questionnaires, threat intelligence capabilities, and agentic AI to give organizations a holistic view of their risk surface.
Our Analytics team at UpGuard helps every team extract key insights from the data we collect. We pull data from multiple sources, build insightful dashboards that display key metrics, and support leaders in making decisions. We also build integrations between systems to collect the required data seamlessly.
Why are we hiring for this role?
As UpGuard continues to scale at an increasing rate, each of our teams, from Product to Sales, is becoming hungrier for data and insights. As a result, our central Analytics function is bandwidth-constrained and needs to scale with our business growth. With the emergence of automation and AI, we are looking to democratise data insights and analytics. To do this, we need to give our users access to data while still ensuring that data integrity, security and governance are maintained. This role will be a key enabler of that vision and will accelerate our efforts in this area.

What you'll do

  • Data Integration & Pipeline Development: Design, build, and maintain reliable data pipelines that consolidate information from our internal systems (CRM, support platforms, marketing, product usage data, etc.) and relevant third-party sources.
  • Semantic Layer Design & Management: Develop and manage a comprehensive semantic layer (potentially using technologies like LookML, dbt, SQLMesh, or agents) that provides clear definitions and mappings from data concepts to the underlying database structures, as well as high-level business logic.
  • Data Quality & Governance: Implement and enforce data quality checks, validation rules, and governance processes to ensure the accuracy, consistency, and reliability of data; a minimal sketch of one such check appears after this list. Be mindful of how you handle data securely (we’re a cybersecurity company).
  • AI & Agent Data Enablement: Ensure AI agents have access to the necessary structured and unstructured data (Knowledge Bases, support tickets, documentation, internal data, pricing info, marketing content, etc.) required for them to perform tasks like answering questions, qualifying leads, preparing for calls, and identifying opportunities.
  • Documentation & Knowledge Sharing: Create clear, self-maintaining documentation for data models, pipelines, and the semantic layer to enable both human users and AI agents to effectively interact with the data warehouse.
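To make the data quality point concrete, here is a minimal sketch of one such check as a small Go worker against BigQuery (which we use for storage and querying); the project, dataset, table and column names are invented placeholders, not our actual schema:

```go
package main

import (
	"context"
	"fmt"
	"log"

	"cloud.google.com/go/bigquery"
)

func main() {
	ctx := context.Background()

	// Placeholder project ID; a real worker would read this from config.
	client, err := bigquery.NewClient(ctx, "example-project")
	if err != nil {
		log.Fatalf("bigquery.NewClient: %v", err)
	}
	defer client.Close()

	// One validation rule: no row in the consolidated table may be
	// missing its key. Dataset, table and column names are invented.
	q := client.Query(`
		SELECT COUNT(*) AS bad_rows
		FROM analytics.customer_events
		WHERE customer_id IS NULL`)

	it, err := q.Read(ctx)
	if err != nil {
		log.Fatalf("query: %v", err)
	}

	var row struct {
		BadRows int64 `bigquery:"bad_rows"`
	}
	if err := it.Next(&row); err != nil {
		log.Fatalf("read result: %v", err)
	}
	if row.BadRows > 0 {
		log.Fatalf("quality check failed: %d rows with NULL customer_id", row.BadRows)
	}
	fmt.Println("quality check passed")
}
```

In practice, checks like this would run inside the pipeline itself (as dbt tests, scheduled queries, or small workers like the one above) and fail loudly before bad data ever reaches a dashboard.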

What you'll bring

  • An AI-first mindset, and a track record of scaling an Analytics and BI function at another SaaS business as it grew, using cutting-edge tools and technology.
  • 5+ years of experience with data sourcing, storage and modelling, delivering business value right through to the BI platform.
  • A can-do attitude and drive to build innovative, out-of-the-box solutions using technologies that you may be unfamiliar with.
  • A security-first mindset - we are a cybersecurity company!

What would give you an edge?

  • Looker is our main BI platform: you should be familiar with Explores, Looks, Dashboards and the Developer interface - maintaining dimensions and measures, creating and maintaining models, and running and debugging raw SQL queries. You should also be able to edit and deploy custom visualisations using Looker’s extension SDK and to interface with Looker’s API.
  • We use Cloud SQL (PostgreSQL) and BigQuery for storage and querying: be comfortable writing complex queries and working with indices, materialised views, clustering, partitioning, etc. (a sketch of a partitioned, clustered table definition follows this list).
  • We use containers, Docker and Kubernetes (via GKE) to run our internal automations: be familiar and confident working as a platform engineer in our GCP project.
  • Be familiar with technologies like n8n for automating processes, and have some programming experience - our non-standard ETL workers are written in Go (a minimal sketch of one appears after this list).
  • Be comfortable interfacing with a variety of APIs from our vendor platforms (either REST+JSON or an MCP server, depending on what a vendor platform throws at you).
  • Have experience with version control via GitHub, and a basic review process, like GitHub Flow.
  • If all else fails, be comfortable and confident falling back to a custom-built solution, just to get data into the right shape or form so that analysts can continue their work.
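On the Cloud SQL/BigQuery point above, here is a minimal sketch of creating a day-partitioned table clustered by customer using the BigQuery Go client; the schema and every name in it are illustrative assumptions, not a prescribed design:

```go
package main

import (
	"context"
	"log"

	"cloud.google.com/go/bigquery"
)

func main() {
	ctx := context.Background()
	client, err := bigquery.NewClient(ctx, "example-project") // placeholder project ID
	if err != nil {
		log.Fatalf("bigquery.NewClient: %v", err)
	}
	defer client.Close()

	// Partition by event day and cluster by customer so the common
	// "this customer, this date range" filters prune data cheaply.
	meta := &bigquery.TableMetadata{
		Schema: bigquery.Schema{
			{Name: "event_time", Type: bigquery.TimestampFieldType, Required: true},
			{Name: "customer_id", Type: bigquery.StringFieldType},
			{Name: "payload", Type: bigquery.StringFieldType},
		},
		TimePartitioning: &bigquery.TimePartitioning{
			Type:  bigquery.DayPartitioningType,
			Field: "event_time",
		},
		Clustering: &bigquery.Clustering{Fields: []string{"customer_id"}},
	}
	if err := client.Dataset("analytics").Table("events").Create(ctx, meta); err != nil {
		log.Fatalf("create table: %v", err)
	}
}
```

The right partition and clustering keys depend entirely on the actual access patterns; choices like these are what keep query costs sane as tables grow.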
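On the Go and vendor API points, here is a minimal sketch of the REST+JSON side of a non-standard ETL worker; the endpoint, token variable and record fields are all invented for illustration:

```go
package main

import (
	"encoding/json"
	"fmt"
	"log"
	"net/http"
	"os"
	"time"
)

// Record mirrors one object in a hypothetical vendor's JSON payload;
// the fields are illustrative, not a real vendor schema.
type Record struct {
	ID        string    `json:"id"`
	UpdatedAt time.Time `json:"updated_at"`
}

func main() {
	// A hypothetical REST+JSON endpoint and token; real workers would
	// take these from configuration.
	req, err := http.NewRequest("GET", "https://api.example-vendor.com/v1/records?limit=100", nil)
	if err != nil {
		log.Fatal(err)
	}
	req.Header.Set("Authorization", "Bearer "+os.Getenv("VENDOR_API_TOKEN"))

	client := &http.Client{Timeout: 30 * time.Second}
	resp, err := client.Do(req)
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()
	if resp.StatusCode != http.StatusOK {
		log.Fatalf("vendor API returned %s", resp.Status)
	}

	var records []Record
	if err := json.NewDecoder(resp.Body).Decode(&records); err != nil {
		log.Fatalf("decode: %v", err)
	}
	fmt.Printf("fetched %d records\n", len(records))
}
```

A real worker would also page through results, handle rate limits and retries, and load the records into the warehouse rather than just counting them.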
UpGuard is a Certified Great Place to Work® in the US, Australia, UK and India, establishing its position as a leading global technology employer. 99% of team members agree that UpGuard is a great place to work. Apply now to find out why!
As an Equal Employment Opportunity and Affirmative Action Employer, qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender perception or identity, national origin, age, marital status, protected veteran status, or disability status.
For applications to positions in the United States, please note, at this time we can only support hiring in the following US states: CA, MD, MA, IL, OR, WA, CO, TX, FL, PA, LA, MO, or DC.
Before starting work with us, you will need to undertake a national police history check and reference checks. Also please note that at this time, we cannot support candidates requiring visa sponsorship or relocation.