




Summary: Seeking a Data Engineer with multicloud experience to lead complex data initiatives, define architectures, provide technical guidance, and manage the organization’s master data repository.

Highlights:
1. Lead complex data initiatives and define architectures
2. Optimize scalable data pipelines using AWS and GCP services
3. Evolve and maintain the master data repository, ensuring quality and governance

We are looking for a Data Engineer with solid experience in multicloud environments, capable of leading complex data initiatives, defining architectures, and providing technical guidance to the team. This role will be responsible for the evolution and maintenance of the organization’s master data repository, ensuring quality, consistency, and governance. Advanced English is required, as the candidate will participate in meetings, documentation, and collaborative work with global teams.

Responsibilities
1. Optimize scalable data pipelines using AWS Glue, Lambda, API Gateway, DynamoDB, and GCP services (BigQuery, Cloud Storage, Cloud Functions).
2. Lead data integration and transformation processes using SSIS, Python, and ETL/ELT frameworks.
3. Administer and optimize SQL Server and PostgreSQL databases, performing advanced tuning.
4. Manage data-oriented APIs and services.
5. Implement advanced monitoring using Datadog, Splunk, and native GCP tools.
6. Manage and automate deployments using Azure DevOps.

Requirements
1 - Microsoft SQL Server
2 - Application Remediation
3 - IT Troubleshooting
4 - Engineering Support
5 - Advanced English (C1 - Expert)


