Apache Airflow and SQL Data Engineer Technology Lead – US

Full time @Infosys in Information Technology (IT)
  • Saint Louis, MO
  • Post Date : April 10, 2025
  • Apply Before : April 24, 2025

Job Detail

  • Job ID 9211
  • Experience: Less Than 1 Year
  • Qualifications: Bachelor's Degree

Job Description

Infosys is seeking Data Engineers with Apache Airflow and SQL experience. In this role, you will work directly with the project leaders to develop Data Orchestration and ETL Pipelines using Apache Airflow, implement workflow automation, and contribute to ETL initiatives. You will also be responsible for SQL-based data processing and transformations while ensuring seamless integration with on-prem database environments. Python will be required for scripting and custom Airflow development. You will be part of an entrepreneurship and learning culture, where teamwork and collaboration are encouraged, excellence is rewarded, and diversity is respected and valued.
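To illustrate the kind of work described above, here is a minimal sketch of a SQL-based transformation step, the sort of logic an Airflow task might invoke as part of an ETL pipeline. This is not Infosys's pipeline; the table names, schema, and conversion logic are hypothetical, and SQLite stands in for the on-prem database.

```python
# Minimal ETL sketch: extract from a staging table, transform with SQL,
# load into a reporting table. All names and schema are hypothetical.
import sqlite3

def run_etl(conn: sqlite3.Connection) -> list:
    cur = conn.cursor()
    # Extract: a raw staging table (hypothetical schema).
    cur.execute("CREATE TABLE IF NOT EXISTS raw_orders (id INTEGER, amount_cents INTEGER)")
    cur.executemany("INSERT INTO raw_orders VALUES (?, ?)", [(1, 1250), (2, 399)])
    # Transform + load: SQL-based transformation into a reporting table.
    cur.execute("CREATE TABLE IF NOT EXISTS orders_usd (id INTEGER, amount_usd REAL)")
    cur.execute("INSERT INTO orders_usd SELECT id, amount_cents / 100.0 FROM raw_orders")
    conn.commit()
    return cur.execute("SELECT id, amount_usd FROM orders_usd ORDER BY id").fetchall()

rows = run_etl(sqlite3.connect(":memory:"))
# rows == [(1, 12.5), (2, 3.99)]
```

In an Airflow deployment, a function like `run_etl` would typically be wired into a DAG as a task (for example, via a PythonOperator), with Airflow handling the scheduling and dependency ordering.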
Required Qualifications:

Bachelor’s degree or foreign equivalent required from an accredited institution. Will also consider three years of progressive experience in the specialty in lieu of every year of education.
All applicants authorized to work in the United States are encouraged to apply.
Candidate must live within commuting distance of Denver, CO, St. Louis, MO or be willing to relocate. Travel within the US may be required.
At least 4 years of experience in Information Technology
At least 3 years of project experience in Apache Airflow, including DAG development.
At least 3 years of project experience in Python and SQL.
Python experience for Airflow scripting, data transformation, and automation.
Good understanding of ETL concepts and data pipeline automation.
Preferred Qualifications:
Experience with Cloud Platforms (AWS or Azure).
Experience in building custom Airflow operators and plugins for workflow automation.
Experience working with SQL and PL/SQL Procedures for data transformation, optimization, and querying.
Experience in Big Data ecosystem using Hadoop/Scala/Spark/PySpark.
Experience in data warehouse using any ETL tool.
Experience in Unix shell scripting.
Experience in the Telecom domain.
The job may entail extensive travel. The job may also entail sitting as well as working at a computer for extended periods of time. Candidates should be able to effectively communicate by telephone, email, and face to face.
Estimated annual compensation range for candidates based in the below locations will be

