Apache Airflow Architect

Key Job Details

Role : Apache Airflow Architect

Location :

Level :

Employment type : Full Time

About the job

Join AiRo Digital Labs and build your career with a leader in emerging digital technologies such as robotic process automation, conversational AI, machine learning, the Internet of Things, voice-based technologies, and cloud enablement. At AiRo, we offer competitive benefits and compensation packages along with the opportunity to learn on the job, deepen your process knowledge, and grow your career. What’s more, you will have fun solving some of the most complex business problems.


Responsibilities:

  • Design and develop DAGs: Create and maintain DAGs that orchestrate workflows such as data loading, transformation, and reporting (a minimal sketch follows this list).
  • Implement ETL jobs: Develop and deploy jobs that extract data from source systems, transform it, and load it into target stores.
  • Manage and configure workflows: Create, manage, and configure data pipelines, ensuring their efficiency and reliability.
  • Ensure data quality: Implement data validation processes to maintain data integrity.
  • Write custom operators, sensors, and hooks: Develop custom Airflow components to extend its functionality and integrate with specific tools and systems (see the operator sketch after the Skills Required list).
  • Collaborate with teams: Work with other data engineers and stakeholders to understand business needs and translate them into Airflow workflows.
  • Contribute to a GitHub-driven environment: Follow coding standards and contribute to a collaborative development process.
  • Write unit and end-to-end tests: Ensure the reliability and quality of Airflow pipelines.
  • Work with data sources and storage: Integrate Airflow with various databases, cloud services, and data lakes.
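
To make the DAG and ETL items above concrete, here is a minimal sketch of the kind of workflow this role involves. It assumes Airflow 2.4+ (the TaskFlow API with the schedule argument); the DAG id, schedule, and record shapes are illustrative assumptions, not part of the role.

    from datetime import datetime

    from airflow.decorators import dag, task


    @dag(
        dag_id="example_daily_etl",  # hypothetical name
        schedule="@daily",           # Airflow 2.4+; older releases use schedule_interval
        start_date=datetime(2024, 1, 1),
        catchup=False,
    )
    def example_daily_etl():
        @task
        def extract() -> list:
            # Stand-in for reading from a source database or API.
            return [{"id": 1, "amount": "10.50"}, {"id": 2, "amount": "3.25"}]

        @task
        def transform(rows: list) -> list:
            # Cast amounts to floats; a real job would also validate the rows.
            return [{**row, "amount": float(row["amount"])} for row in rows]

        @task
        def load(rows: list) -> None:
            # Stand-in for writing to a warehouse table.
            print(f"loaded {len(rows)} rows")

        load(transform(extract()))


    example_daily_etl()

Chaining the decorated tasks, load(transform(extract())), is what builds the dependency graph; Airflow passes the return values between tasks via XCom.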

Years of Experience

  • 9+ Years.


Skills Required:

  • Proficiency in Python: Strong Python programming skills; Airflow and the DAGs it runs are written in Python.
  • Understanding of Airflow architecture and concepts: Knowledge of Airflow’s components, DAGs, operators, and scheduling mechanisms.
  • Experience with SQL and database design: Comfortable writing SQL for data manipulation and designing database schemas that back data pipelines.
  • Experience with ETL processes: Understanding of ETL principles and best practices.
  • Knowledge of data warehousing and data lakes: Familiarity with data warehousing concepts and data lake technologies.
  • Familiarity with version control systems: Experience with Git and other version control systems for code management.
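
As an illustration of the custom-component work called out in the responsibilities, the sketch below subclasses BaseOperator to build a row-count data-quality check. The operator name, table, and threshold are assumptions for the example, and the counting logic is a stand-in; a real implementation would issue SELECT COUNT(*) through a database hook such as PostgresHook.

    from airflow.exceptions import AirflowException
    from airflow.models.baseoperator import BaseOperator


    class RowCountCheckOperator(BaseOperator):
        """Fail the task when a table holds fewer rows than expected (illustrative)."""

        def __init__(self, *, table: str, min_rows: int, **kwargs):
            super().__init__(**kwargs)
            self.table = table
            self.min_rows = min_rows

        def execute(self, context):
            # Stand-in for a SELECT COUNT(*) issued through a database hook;
            # hard-coded here so the sketch stays self-contained.
            row_count = self._count_rows()
            if row_count < self.min_rows:
                raise AirflowException(
                    f"{self.table} has {row_count} rows; expected at least {self.min_rows}"
                )
            self.log.info("%s passed the row-count check (%d rows)", self.table, row_count)
            return row_count

        def _count_rows(self) -> int:
            # Hypothetical helper; replace with a real warehouse query.
            return 42

Inside a DAG, the operator is instantiated like any built-in one, for example RowCountCheckOperator(task_id="check_orders", table="orders", min_rows=1).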

Apply for the Apache Airflow Architect role