




We are searching for a **Chief Data Integration Engineer** to lead the architecture, evolution, and support of the Guyana Enterprise Data platform, enabling top-tier data products and improving existing capabilities. You will collaborate with international teams across time zones, applying Cognite Data Fusion, Azure, and client systems to deliver resilient, high-performance data solutions. If this aligns with your vision for advanced data engineering and integration, we encourage you to apply.

**Responsibilities**

* Direct and oversee the architecture and implementation of ETL and ELT workflows on the enterprise data platform
* Refine and manage query performance in Cognite Data Fusion using SQL and GraphQL
* Institute secure coding practices and apply timely security patches in cloud-based systems
* Coordinate with global teams across multiple regions to ensure timely and accurate data delivery
* Administer and tune queries in Snowflake, PostgreSQL, and other relational database environments
* Design, launch, and sustain applications within Azure and OpenShift ecosystems
* Automate extraction and transformation pipelines using Python, Scala, Databricks, and Azure Data Factory
* Integrate REST and GraphQL services with backend databases and operational applications
* Handle ingestion, sanitization, and augmentation of structured and unstructured datasets
* Construct and manage complex aggregate queries spanning diverse schemas
* Supervise and resolve pipeline performance issues to guarantee consistent data flow
* Assess and embed new tools and technologies to advance platform functionality
* Ensure data models comply with standards such as OSDU and ISO 14224
* Engage with clients to validate that solutions address organizational objectives
* Record designs and workflows in Azure DevOps in line with Agile principles

**Requirements**

* 7+ years of experience developing data integration and ETL/ELT solutions, with a demonstrated leadership record
* Proven capability in guiding teams through complex data engineering initiatives
* Hands-on background in Agile methodologies and collaborative project delivery
* Advanced proficiency in Python, Scala, and Java for REST APIs and backend services
* Expert knowledge of SQL, with experience in Snowflake, PostgreSQL, and database administration
* Deep understanding of object-oriented programming and design patterns
* Familiarity with both relational and NoSQL database systems
* Strong data modeling expertise meeting industrial standards
* Ability to process unstructured data stored in cloud environments
* Experience with CI/CD approaches to deploying data pipelines
* Exceptional troubleshooting skills for intricate data process issues
* Outstanding communication skills for cross-cultural teamwork
* Upper-Intermediate (B2) English proficiency for professional dialogue

**Nice to have**

* Advanced skills with Apache Spark
* Ability to produce clear architectural diagrams
* Proficiency with Azure Data Factory and Databricks in enterprise contexts
* Knowledge of Azure Data Share capabilities
* Experience integrating with Cognite Data Fusion
* Ability to leverage Microsoft Power BI for data visualization


