Data Engineer

Andela

Software Engineering, Data Science
United States
Posted on Apr 8, 2026

Location

United States

Employment Type

Full time

Location Type

Remote

Department

Product & Engineering, Data

Data Engineer
About Andela:
At Andela, we know brilliance is evenly distributed around the world, but opportunity is not. For over 10 years, Andela has connected its customers with top global, remote technical talent from over 135 countries with the majority residing in emerging markets like Africa and Latin America.

As one of the world’s largest talent marketplaces, Andela gives companies greater flexibility to quickly deploy qualified technologists. With talent highly skilled in advanced technologies to support Application Development, Artificial Intelligence, Cloud & DevOps, Data Engineering, and much more, customers experience 33% faster project delivery. The company’s exclusive AI-powered platform, Andela Talent Cloud, is the industry’s only unified platform managing the complete global talent lifecycle and enables customers to fill individual roles or engage fully managed teams up to 66% faster.

Andela is on the cusp of two breakout industry transformations: one in staffing/hiring and the other in software development, both accelerated by generative AI.

Are you an exceptional, hungry leader seasoned in scaling businesses through transformation and growth? Join us and change the world.

Job Summary:

Andela is transitioning from a world-class talent marketplace into a high-scale, AI-integrated Talent Cloud. As a Data Engineer, you will build and maintain the data infrastructure that powers Andela's business intelligence and analytics capabilities. This role requires someone who thinks about data as a product, ensuring data is reliable, accessible, and structured to enable self-service analytics across the organization.

Exceptional Leadership:

As an Andelan, you’ll serve as a role model for the rest of the company. Think about the feedback your peers typically give you – if it usually sounds like the feedback below, we want to hear from you.

  • Low ego, low drama, servant leader: You share credit, take blame. You like being wrong because it means someone else had an even better idea.

  • One team mentality: You break silos across teams. You put the company and mission first above your team alone.

  • Great listener, hungry for feedback: You’re always seeking to improve – our product, our business, yourself. You solicit diverse opinions and deeply listen.

  • Owner, not renter: You see a problem, you fix it or find someone who will. The buck stops with you.

  • Player-coach: You fly high (create strategy) AND low (know the details that matter). You roll up your sleeves and get scrappy, proactively collaborating with your team while staying engaged in the details that matter.

  • Business problem solver: You’re not just a functional expert; you consistently get praise for approaching your function through the lens of solving business problems.

Key Responsibilities:

Data Infrastructure

  • Design, build, and maintain scalable data pipelines and ETL processes that ensure data quality and reliability

  • Manage data warehouse architecture and optimization for performance and cost-efficiency

  • Implement data governance standards, security protocols, and monitoring systems

  • Own data ingestion from multiple sources and ensure seamless integration into analytics platforms

Enabling Self-Service Analytics

  • Build data models and semantic layers that enable business users to access insights independently

  • Create and maintain data documentation, lineage, and data dictionaries

  • Partner with BI Analysts to understand data needs and proactively improve data availability

  • Implement automated data quality checks and alerting systems

Technical Excellence

  • Utilize tools such as Cloud Composer, Terraform, Airflow, Fivetran, and dbt to build robust data infrastructure

  • Optimize SQL queries and warehouse performance for analytical workloads

  • Stay current with modern data engineering best practices and tools

Qualifications:

  • 3+ years of experience in data engineering roles, with expert-level SQL skills and data warehousing experience (BigQuery preferred)

  • Proficiency with data pipeline tools (Airflow, Terraform, Fivetran, dbt, or similar)

  • Strong proficiency in Python, with a focus on writing clean, maintainable code for data pipeline development, API integrations, and ETL/ELT processes

  • Proven ability to leverage AI-driven development tools to accelerate pipeline construction, automate documentation, and perform rapid root-cause analysis of data quality issues

  • Strong understanding of data modeling principles and dimensional design

  • Experience with data governance, security, and quality frameworks

  • Ability to translate business requirements into technical data solutions

  • Experience with Looker's semantic layer (LookML) to define business logic and create self-service reporting environments is highly desirable

  • Bachelor's degree in Computer Science, Engineering, or related technical field preferred

#LI-REMOTE
#LI-RDR