Expert with a minimum of 10 years of experience in data science or a related field to perform all activities needed to implement robust AI modules, pipelines, solutions, and services for advanced AI technologies.
Develop scalable backend infrastructure to support real-time data ingestion, model inference, and user access.
Implement components to consume and process events from streaming output sources in real-time, utilizing event-driven architecture patterns.
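As a minimal sketch of the event-driven pattern referenced above, the following uses an in-memory queue and a handler registry; in a real deployment the queue would be replaced by a streaming client (e.g. Kafka or a cloud pub/sub service), and all names here (`EventConsumer`, `on`, `publish`) are illustrative assumptions, not part of any specific library.

```python
import queue

class EventConsumer:
    """Dispatches queued events to handlers registered per event type."""

    def __init__(self):
        self._handlers = {}       # event type -> list of handler callables
        self._queue = queue.Queue()

    def on(self, event_type, handler):
        # Register a handler for a given event type.
        self._handlers.setdefault(event_type, []).append(handler)

    def publish(self, event_type, payload):
        # Enqueue an event; stands in for a message arriving from a stream.
        self._queue.put((event_type, payload))

    def run(self):
        # Drain the queue, dispatching each event to its registered handlers.
        while not self._queue.empty():
            event_type, payload = self._queue.get()
            for handler in self._handlers.get(event_type, []):
                handler(payload)

consumer = EventConsumer()
seen = []
consumer.on("sensor.reading", lambda p: seen.append(p["value"]))
consumer.publish("sensor.reading", {"value": 42})
consumer.publish("sensor.reading", {"value": 7})
consumer.run()  # seen is now [42, 7]
```

The handler registry keyed by event type is what makes the design event-driven: new features can subscribe to existing event types without modifying the producers.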
Utilize microservices and agentic architecture for scalability and to support the dynamic addition of new features.
Build a scalable real-time data ingestion pipeline that covers data preprocessing, normalization, labelling, transformation, feature engineering, and storage across multiple data streams.
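The staged pipeline above can be sketched as a composition of per-record functions. This is a simplified, single-process illustration under assumed data (a `reading` field with a known maximum of 100); the stage names and the list-based "storage" step are hypothetical, not a prescribed implementation.

```python
def preprocess(record):
    # Drop records containing missing values.
    return record if all(v is not None for v in record.values()) else None

def normalize(record):
    # Scale the raw reading into [0, 1], assuming a known max of 100.
    record["reading"] = record["reading"] / 100.0
    return record

def engineer_features(record):
    # Derive a simple binary feature from the normalized reading.
    record["is_high"] = record["reading"] > 0.5
    return record

def run_pipeline(stream, stages):
    # Pass each record through every stage; a stage returning None
    # filters the record out. Surviving records are "stored" in a list.
    stored = []
    for record in stream:
        for stage in stages:
            record = stage(record)
            if record is None:
                break
        else:
            stored.append(record)
    return stored

stream = [{"reading": 80}, {"reading": None}, {"reading": 20}]
result = run_pipeline(stream, [preprocess, normalize, engineer_features])
# result keeps 2 of the 3 records; the one with a missing value is dropped
```

Keeping each stage a pure function of a single record is what lets the same logic scale out: the stages can be lifted unchanged into a distributed stream processor.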
Requirements
Bachelor’s degree in data science or a related field (or equivalent); Master’s degree preferred.
Minimum of 10 years of experience in data science or a related field, performing all activities needed to implement robust AI modules, pipelines, solutions, and services for advanced AI technologies.