




What does the company do?
A leading company in Argentina's digital media sector with a long-standing market presence, known for delivering high-quality, up-to-date journalistic content through its online platform.

What do you need to join the team?

On a personal level:
- Critical thinking and the ability to solve complex data-driven problems.
- Ability to work independently, make technical decisions, and lead projects from start to finish.
- Clarity and precision in communicating results to Product and Technology teams.

On a technical level:
- At least 2 years of experience in Data Science or Machine Learning.
- At least 3 years of experience with Python and SQL | Mandatory.
- Experience with Google Cloud Platform (GCP), including BigQuery, Cloud Storage, and Cloud Functions | Mandatory.
- Proficiency with notebooks for experimentation and prototyping.
- Experience in data modeling, definition of productivity metrics, and reporting automation.
- Knowledge and/or hands-on experience in MLOps, monitoring, and deployment of machine learning models.

It's a plus if you have:
- Experience with Vertex AI or MLflow and with infrastructure as code (Terraform, Cloud Build, etc.).
- Knowledge of maintaining models in production (monitoring, retraining, drift detection).

What will you do?
As a Senior Machine Learning Engineer, you will:
- Design, train, and optimize machine learning models in Jupyter notebook environments.
- Implement data pipelines and workflows on GCP (BigQuery, Cloud Functions, Workflows, Vertex AI).
- Develop and maintain data processing and transformation scripts for model training and validation.
- Ensure data quality, integrity, and traceability through best practices in versioning and documentation.
- Apply advanced natural language processing (NLP) techniques: embeddings, classification, language modeling, entity extraction.
- Use machine learning frameworks such as scikit-learn, PyTorch, or TensorFlow, as appropriate for the use case.
- Participate in the complete model lifecycle, from data exploration to deployment and production monitoring.
- Collaborate with multidisciplinary teams using agile methodologies, proposing technical and process improvements.

What is the challenge of the position?
The main challenge will be developing scalable machine learning solutions and data pipelines on Google Cloud Platform (GCP). The role combines experimentation, engineering, and production deployment, making it a key part of our data team.

When and where will you work?
You will work 100% remotely, exclusively for residents of AMBA, Buenos Aires (Argentina). Monday to Friday, from 09:00 to 18:00.

Who will you work with?
You will work with the Data Science team and collaborate closely with the Product and Technology teams, serving as a key bridge for communicating results and implementing technical solutions.

What tools will you work with?
Python, SQL, Google Cloud Platform (GCP), BigQuery, Dataflow, and Vertex AI, along with tools and frameworks for MLOps, monitoring, and model deployment.

What do they offer?
- Direct employment relationship.
- Home office.
- Discounts and benefits platform.
- Health insurance for you and your family.
- Career growth opportunities.

What stages does the selection process consist of?
You will have an initial interview with our recruiter, Lorena Dos Santos Moraes, to discuss your professional experience and interests. Then, with the hiring company, you will have two interviews: one with Human Resources and one with the technical team.


