KubeCraftJobs

DevOps & Cloud Job Board

Data Science with GCP

Tata Consultancy Services

Bengaluru, Hyderabad, Pune

On-site
Mid Level
Full Time
Posted January 05, 2026

Tech Stack

google-cloud-platform microsoft-azure amazon-web-services python tensorflow pytorch transformers langchain llamaindex vertex bedrock streamlit fastapi docker kubernetes amazon-sagemaker claude-by-anthropic gemini faiss crewai aws-lambda amazon-ecs amazon-eks amazon-s3 amazon-api-gateway azure-storage github github-actions azure-devops guardrails


Job Description

**Desired Competencies (Technical/Behavioral Competency)**

**Must-Have**
- Core Expertise: 5+ years in AI/ML, NLP, and model deployment on GCP/Azure/AWS; strong Python skills; experience with TensorFlow, PyTorch, Transformers, LangChain, and LlamaIndex.
- GenAI & Advanced AI: Minimum 1 year in Generative AI, RAG, and Agentic AI; familiarity with LLM fine-tuning, prompt engineering, and multi-agent systems.
- Cloud Services: Hands-on experience with Vertex AI, Azure OpenAI, AWS Bedrock, and related services.

**Good-to-Have**
- Experience with Streamlit, FastAPI, Docker/Kubernetes, and MLOps best practices.
- Strong problem-solving and analytical thinking.
- Ability to work in cross-functional teams and communicate technical concepts clearly.

**Responsibility of / Expectations from the Role**

AI/ML Development & Deployment:
- Design, build, and deploy ML/NLP models for production environments using GCP AI Platform, Azure ML, or AWS SageMaker.
- Optimize models for performance, scalability, and cost efficiency.

Generative AI & Advanced Architectures:
- Implement GenAI solutions leveraging LLMs (OpenAI, Claude, Gemini, etc.).
- Develop RAG pipelines integrating vector databases (e.g., Pinecone, ChromaDB, FAISS) and document indexing (an illustrative sketch follows the description below).
- Build Agentic AI systems using frameworks such as AutoGen, CrewAI, or similar for reasoning, planning, and multi-agent orchestration.

Cloud & Infrastructure:
- Utilize cloud-native services for AI workloads:
  - AWS: SageMaker, Lambda, ECS/EKS, Bedrock, S3, API Gateway.
  - Azure: Azure ML, Cognitive Services, OpenAI Service, AKS, Blob Storage.
  - GCP: Vertex AI, AI Hub, BigQuery ML, Cloud Functions, Pub/Sub.
- Implement CI/CD pipelines for ML models using GitHub Actions, Azure DevOps, or Cloud Build.

Security & Governance:
- Ensure compliance with AI governance, data privacy, and responsible AI principles.
- Implement guardrails for safe and ethical AI usage.
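To illustrate the RAG-pipeline responsibility mentioned above, here is a minimal retrieval sketch using FAISS with a sentence-transformer embedding model. The model name, sample documents, and query are placeholder choices for this sketch, not part of the job posting, and it assumes the `faiss-cpu` and `sentence-transformers` packages are installed.

```python
# Minimal sketch of a RAG retrieval step: embed documents, index them in FAISS,
# and fetch the most relevant passages for a query. Model and data are placeholders.
import faiss
from sentence_transformers import SentenceTransformer

documents = [
    "Vertex AI is Google Cloud's managed ML platform.",
    "SageMaker is AWS's managed service for training and deploying models.",
    "Azure ML provides model registries and managed online endpoints.",
]

# Hypothetical embedding model choice for the sketch.
model = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = model.encode(documents, normalize_embeddings=True).astype("float32")

# In-memory FAISS index over normalized embeddings (inner product ~ cosine similarity).
index = faiss.IndexFlatIP(embeddings.shape[1])
index.add(embeddings)

query = "Which managed service deploys models on GCP?"
query_vec = model.encode([query], normalize_embeddings=True).astype("float32")
scores, ids = index.search(query_vec, 2)

# The retrieved passages would then be injected into an LLM prompt (the generation step).
for score, doc_id in zip(scores[0], ids[0]):
    print(f"{score:.3f}  {documents[doc_id]}")
```

In a production setting, the in-memory index would typically be replaced by a managed vector store and the retrieved context passed to an LLM service such as those named in the posting.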