
Data Engineer

JetBridge · Remote (Latin America)


No Relocation

Posted: February 17, 2026


Job Description
Help build the data foundation powering an AI platform that makes accessing healthcare 10x easier and more cost-effective. This is a well-funded healthcare technology company using AI and modern data infrastructure to transform how healthcare and public health decisions are made. The team is small, mission-driven, and building systems that turn raw healthcare data into actionable intelligence at scale. They are hiring a Data Engineer to design and own the modern data stack that powers product, analytics, and AI initiatives.

The Role

You will design, build, and own data pipelines across a modern stack including Snowflake, Airflow, DBT, and AWS. This is not a maintenance role. You will lead architectural discussions, influence technical direction, and build scalable data systems from ingestion through transformation and delivery. You will work closely with product managers, data scientists, and engineering leadership to turn complex healthcare data into reliable, production-ready data products. This role requires strong ownership, architectural thinking, and the ability to operate in a fast-moving, collaborative environment.
What You’ll Do
  • Design, build, and maintain data pipelines using Snowflake, Airflow, and DBT
  • Lead architectural discussions around the modern data stack
  • Develop scalable ETL and ELT processes using Python and SQL
  • Build and enforce data quality frameworks across the platform
  • Integrate internal and external systems via REST APIs
  • Partner with data science and product teams to prototype and launch data products
  • Contribute to code reviews and mentor teammates
  • Leverage AI coding tools while maintaining high quality and security standards

What They’re Looking For
  • 5+ years of software engineering experience
  • 4+ years in a data engineering role working with modern data stacks
  • Deep experience with Snowflake, Airflow, DBT, and AWS data services
  • Strong Python and SQL skills
  • Experience building scalable ETL/ELT workflows
  • Experience designing and implementing data quality frameworks
  • Experience integrating APIs and transforming JSON payloads at scale
  • Strong communication skills and cross-functional collaboration
  • Experience using AI coding tools responsibly and effectively

Nice to Have
  • Kafka experience
  • Experience with Sigma or similar BI tools
  • Healthcare or public health domain exposure

Why This Role Stands Out
  • Mission-driven healthcare impact
  • True ownership of the modern data stack
  • High visibility across product and AI initiatives
  • Opportunity to influence architecture early
  • Collaborative, high-caliber engineering team
  • Rapidly growing, well-funded company

If you want to build scalable data systems that directly impact healthcare access and decision-making, this is a high-leverage opportunity.
We may use artificial intelligence (AI) tools to support parts of the hiring process, such as reviewing applications, analyzing resumes, or assessing responses. These tools assist our recruitment team but do not replace human judgment. Final hiring decisions are ultimately made by humans. If you would like more information about how your data is processed, please contact us.