




Job Summary:
We are seeking a Data Engineer with experience in ETL/ELT processes, Snowflake, Python, SQL, and DataStage to design, develop, and maintain data infrastructure and pipelines, ensuring data quality and traceability.

Key Highlights:
1. Promotes cultural diversity and an inclusive environment.
2. Constant opportunities for professional growth and development.
3. Work in the Systems and Technology Department of a leading company.

We are a market-leading healthcare company that promotes cultural diversity and is committed to providing its people with an inclusive and collaborative environment where you can enhance your skills while being yourself, co-creating a space of continuous growth opportunities. We are looking to hire a Data Engineer for the Systems and Technology Department. Candidates must hold a technical degree (e.g., Computer Engineering, Systems Engineering, Computer Science, or equivalent) and have extensive experience with data processes (ETL/ELT), along with expertise in Snowflake, Python, SQL, and DataStage.

Key Responsibilities:
- Design, develop, and maintain ETL/ELT data processes.
- Test and maintain on-premises and cloud-based (AWS) data infrastructure.
- Integrate and transform data from multiple internal and external sources.
- Build robust, scalable, and optimized data models in Snowflake.
- Develop data pipelines and DAGs in Airflow.
- Apply best practices in Data Warehousing, DataOps, and data governance.
- Ensure data quality, reliability, and availability for business units.
- Document data flows and architectures to guarantee traceability and shared knowledge.

Working Hours: Monday to Thursday, 9 AM–6 PM; Friday, 9 AM–5 PM.

Requirements (Mandatory):
- Proven experience with Snowflake and ETL/ELT processes.
- Advanced SQL proficiency and knowledge of optimization techniques (performance tuning, execution plan analysis, etc.).
- Experience using DataStage for data integration and transformation.
- Python development for data handling and automation.
- Dimensional modeling (facts and dimensions in star schemas).
- DAG development in Airflow.
- Familiarity with code repositories (GitHub/GitLab).
- High level of autonomy and diagnostic ability for resolving technical incidents.
- Proactivity in raising early alerts and documenting workflows in Jira.
- Excellent communication skills for participating in code reviews and collaborating with functional teams.
- Basic to intermediate knowledge of AI for process automation and optimization, prompt engineering, and current AI tools.

Benefits:
- Significant development opportunity.
- Health insurance coverage for you and your family.
- Additional vacation days.
- Connectivity allowance.
- Flexible working hours and a hybrid work model.
- Udemy licenses for self-directed learning.
- Priority access when contracting the company's products and/or services.
- Free vaccination campaign.
- Gym membership coverage through the Wellhub network.
- Corporate discounts for cinemas, theaters, travel, restaurants, spas, and other categories.
