




**Summary:** We are seeking an expert Databricks Engineer with strong Azure ecosystem experience to design, build, and optimize data pipelines and ensure best practices for data quality and maintainability.

**Highlights:**
1. Opportunity to support a high-impact project within a global healthcare leader
2. Focus on designing, building, and optimizing Databricks pipelines in Azure
3. Collaborate with cross-functional teams on delivery-focused projects

**About Our Client:**
Johnson & Johnson is one of the world's most respected and innovative healthcare companies, dedicated to improving the health and well-being of people everywhere. With a global presence across pharmaceuticals, medical devices, and consumer health, J&J is known for combining science, technology, and design to deliver impactful solutions.

**About the Role:**
We're looking for an expert Databricks Engineer with strong experience in the Azure ecosystem to support a high-impact project. This is a contractor role through Workana, starting ASAP, with a 3–4 month initial contract and strong potential for extension.

**What You'll Do:**
* Design, build, and optimize Databricks pipelines in an Azure environment
* Develop reliable ETL/ELT workflows using PySpark / Spark SQL
* Implement and maintain data ingestion from multiple sources (APIs, databases, cloud storage, etc.)
* Improve performance, scalability, and cost-efficiency of Databricks workloads
* Work with stakeholders to translate business needs into technical solutions
* Ensure best practices around data quality, monitoring, documentation, and maintainability
* Collaborate with cross-functional teams (data engineering, analytics, product, etc.)
**Requirements:**
* Proven experience as a Databricks Engineer (expert-level)
* Strong hands-on work in Azure Databricks
* Advanced Spark / PySpark and SQL
* Experience with Azure Data Lake / Blob Storage and Azure-native data workflows
* Solid understanding of data engineering fundamentals (batch processing, orchestration, data modeling basics)
* Comfortable working independently in fast-moving, delivery-focused projects
* English communication skills (written and verbal) for working with global teams

**Nice to Have:**
* Experience with Delta Lake and lakehouse patterns
* Familiarity with orchestration tools (e.g., ADF, Airflow)
* CI/CD exposure for data pipelines (Azure DevOps, Git-based workflows)
* Experience supporting analytics/reporting or downstream ML use cases

**Benefits:**
* Fully remote contractor role (LATAM-friendly)
* Opportunity to work with a global healthcare leader
* High chance of extension after the initial contract


