Job Description
Secure Every Identity, from AI to Human
Identity is the key to unlocking the potential of AI. Okta secures AI by building the trusted, neutral infrastructure that enables organizations to safely embrace this new era. This work requires a relentless drive to solve complex challenges with real-world stakes. We are looking for builders and owners who operate with speed and urgency and execute with excellence.
This is an opportunity to do career-defining work. We're all in on this mission. If you are too, let's talk.
Auth0 is an easy-to-implement authentication and authorization platform designed by developers for developers. We make applications’ login boxes safe, secure, and seamless for anyone logging in.
The Auth0 Data Engineering Team
Within Auth0, the Data Engineering team builds solutions that support analytics needs across the whole organization and owns the Data Platform. It is divided into three groups:
- Pipeline team: sits closest to the Platform team and data producers; responsible for efficient data ingestion and for providing quick access to unmodeled data
- Warehouse team: sits closest to the business teams and data consumers; models the data in the data warehouse to abstract away complexity for data consumers, simplifying the organization’s analysis, reporting, and decision-making
- Interface team: creates and manages connections between the data platform and various internal- or external-facing systems, ensuring everyone has access to consistent and reliable data
The Senior Data Engineer Opportunity
Reporting to the manager of the Data Engineering Pipeline team, the senior data engineer will be a key player in ensuring the reliability and efficiency of our core data platform. This role is crucial, providing the stable data foundation that not only 'runs the business' day-to-day but also directly empowers the organization to unlock growth and build innovative new products. We are looking for an autonomous and proactive engineer who can take ownership of our data ingestion pipelines, work closely with platform teams and data producers, and onboard new data from various internal and external systems. You will be part of a team creating robust, scalable data solutions using a modern, best-in-class toolset.
What you’ll be doing
- Own and operate our critical data pipelines and infrastructure, participating in the team's support rotation to ensure high availability and business continuity.
- Lead troubleshooting efforts to fix complex pipeline or infrastructure issues, demonstrating a willingness to go beyond the direct scope of the team to find resolutions.
- Manage the data onboarding lifecycle, working with stakeholders to integrate new internal and external data sources into our platform using our suite of tools.
- Develop and deploy robust, scalable data solutions using modern tools and technologies like Snowflake, dbt, Airflow, and Terraform.
- Continuously learn and advocate for modern technologies and best practices to improve data delivery and engineering efficiency.
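To give a flavor of the pipeline concerns the responsibilities above touch on, here is a minimal, library-free sketch of idempotent incremental ingestion, where a high-watermark keeps re-runs from reprocessing data. All names and records are illustrative assumptions, not Okta's or Auth0's actual code:

```python
from datetime import datetime, timezone

# Hypothetical upstream records; "updated_at" drives the incremental load.
RECORDS = [
    {"id": 1, "updated_at": "2024-01-01T00:00:00+00:00", "value": "a"},
    {"id": 2, "updated_at": "2024-01-02T00:00:00+00:00", "value": "b"},
    {"id": 1, "updated_at": "2024-01-03T00:00:00+00:00", "value": "a2"},
]

def incremental_load(records, watermark):
    """Return records newer than the watermark, plus the advanced watermark.

    Re-running with the returned watermark yields no new records, which is
    what makes the load idempotent and safe to retry after a failure.
    """
    new = [r for r in records
           if datetime.fromisoformat(r["updated_at"]) > watermark]
    if new:
        watermark = max(datetime.fromisoformat(r["updated_at"]) for r in new)
    return new, watermark

epoch = datetime.min.replace(tzinfo=timezone.utc)
batch, watermark = incremental_load(RECORDS, epoch)
print(len(batch))              # first run picks up all 3 records
rerun, _ = incremental_load(RECORDS, watermark)
print(len(rerun))              # re-run picks up 0: nothing past the watermark
```

In production this pattern would typically be expressed through tools named in this posting, such as an incremental dbt model or Snowpipe, rather than hand-rolled Python.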
What you’ll bring to the role
- 5+ years of software development experience, including at least 3 years working on large-scale data systems.
- Strong proficiency in SQL for data manipulation and Python for data pipeline development.
- Hands-on experience building and operating data pipelines in a cloud environment (AWS preferred).
- Hands-on experience with the modern data stack, specifically:
  - Data Warehousing: Snowflake
  - Data Transformation: dbt
  - Infrastructure as Code: Terraform
- Experience with data ingestion, streaming, and event-driven architecture (e.g., Kafka, CDC, Snowpipe).
- Knowledge of data security and data privacy best practices.
- A strong understanding of data modeling principles.
- A highly autonomous and proactive mindset. You are a quick learner who can self-motivate, prioritize needs, and deliver results in a dynamic environment.
And extra credit if you have experience in any of the following!
- Experience with Snowpark.
- Experience with containerization and orchestration (e.g., Kubernetes).
- Familiarity with the Identity Access & Management (IAM) domain.
- Experience building developer-friendly, API-driven applications using REST and/or gRPC.
#LI-Hybrid
P24587_3348467
The Okta Experience
- Supporting Your Well-Being
- Driving Social Impact
- Developing Talent and Fostering Connection + Community
We are intentional about connection. Our global community, spanning over 20 offices worldwide, is united by a drive to innovate. Your journey begins with an immersive, in-person onboarding experience designed to accelerate your impact and connect you to our mission and team from day one.
Okta is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, ancestry, marital status, age, physical or mental disability, or status as a protected veteran. We also consider for employment qualified applicants with arrest and conviction records, consistent with applicable laws.
If reasonable accommodation is needed to complete any part of the job application, interview process, or onboarding, please use this Form to request an accommodation.
Notice for New York City Applicants & Employees: Okta may use Automated Employment Decision Tools (AEDT), as defined by New York City Local Law 144, that use artificial intelligence, machine learning, or other automated processes to assist in our recruitment and hiring process. In accordance with NYC Local Law 144, if you are an applicant or employee residing in New York City, please click here to view our full NYC AEDT Notice.
Okta is committed to complying with applicable data privacy and security laws and regulations. For more information, please see our Personnel and Job Candidate Privacy Notice at https://www.okta.com/legal/personnel-policy/.
