About Latitude
Latitude Technolabs is a digital solutions company based in Ahmedabad, India, offering a range of services including mobile and web development, ERP solutions, UI/UX design, and digital marketing. The company has a global presence with offices in the USA, Switzerland, and Australia, and has completed over 500 web applications and 100 mobile apps across various industries.
Job Role:
We are hiring an experienced Data Scientist with expertise in AI/ML, LLMs, and Generative AI, who can design and deploy intelligent systems using both traditional and modern AI techniques. The ideal candidate will be able to communicate effectively with clients and stakeholders, understand business needs, and deliver scalable AI solutions.
Role + Responsibilities:
- Design and implement machine learning models using supervised (e.g., regression, classification) and unsupervised (e.g., clustering, dimensionality reduction) techniques.
- Build and fine-tune LLMs (e.g., GPT, LLaMA, Mistral, Falcon, Qwen) for domain-specific applications.
- Develop and deploy Generative AI use cases such as text summarization, Q&A systems, chatbots, content generation, etc.
- Work with Agentic AI frameworks like LangChain and LlamaIndex, and build RAG pipelines integrated with vector stores (FAISS, Pinecone, Chroma).
- Preprocess and analyze large datasets using feature engineering and statistical methods.
- Implement model training, validation, testing, and performance tuning.
- Communicate clearly with clients to understand requirements, explain models, and present results.
- Collaborate with cross-functional teams including engineering and product to productionize models.
- Monitor and retrain models based on feedback or data drift.
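The modeling duties above cover the full train/validate/test cycle for supervised tasks such as classification. A minimal sketch of that workflow with scikit-learn, using synthetic data purely for illustration:

```python
# Toy supervised-learning workflow: train a classifier, then evaluate
# on a held-out split. The dataset is synthetic, for illustration only.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Generate a toy binary-classification dataset.
X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Train, then validate on the held-out split.
model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)
acc = accuracy_score(y_test, model.predict(X_test))
print(f"held-out accuracy: {acc:.2f}")
```

In practice the same pattern extends to performance tuning (cross-validation, hyperparameter search) and to monitoring for data drift after deployment.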
Person Specification and Qualifications:
- MCA or equivalent degree in Computer Science, Data Science, or related field.
- 3+ years of hands-on experience in data science and machine learning.
- Strong understanding of supervised learning algorithms (Logistic Regression, Random Forest, XGBoost, SVM, etc.) and unsupervised methods (KMeans, DBSCAN, PCA, etc.).
- Proficiency in Python and libraries such as pandas, NumPy, scikit-learn, matplotlib, etc.
- Experience with deep learning frameworks like PyTorch or TensorFlow.
- Familiarity with LLMs and libraries like Hugging Face Transformers.
- Solid grounding in NLP, text embeddings, and RAG architecture.
- Exposure to LangChain, LlamaIndex, or agentic AI frameworks.
- Working knowledge of vector databases (FAISS, Chroma, Pinecone).
- Strong communication skills and comfort in interacting with clients/stakeholders.
- Ability to work independently and collaboratively in a fast-paced environment.
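The RAG grounding listed above reduces, at its core, to similarity search over text embeddings. A toy sketch of that retrieval step with NumPy; the vectors here are made up, and a real pipeline would obtain them from an embedding model and store them in a vector database such as FAISS or Chroma:

```python
import numpy as np

def cosine_top_k(query: np.ndarray, docs: np.ndarray, k: int = 2) -> np.ndarray:
    """Return indices of the k rows of `docs` most similar to `query`."""
    # Normalize rows so the dot product equals cosine similarity.
    docs_n = docs / np.linalg.norm(docs, axis=1, keepdims=True)
    query_n = query / np.linalg.norm(query)
    scores = docs_n @ query_n
    # Highest-scoring documents first.
    return np.argsort(scores)[::-1][:k]

# Hypothetical 3-dimensional document embeddings.
doc_embeddings = np.array([
    [1.0, 0.0, 0.0],   # doc 0: aligned with the query
    [0.9, 0.1, 0.0],   # doc 1: nearly aligned
    [0.0, 0.0, 1.0],   # doc 2: orthogonal to the query
])
query = np.array([1.0, 0.0, 0.0])
print(cosine_top_k(query, doc_embeddings))  # doc 0 ranked first, then doc 1
```

The retrieved documents would then be passed to an LLM as context, which is the "generation" half of a RAG system.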
Plus Points (Nice-to-Have):
- Hands-on experience with cloud deployment across major platforms like AWS (including SageMaker), Google Cloud Platform (GCP), and Microsoft Azure.
- Familiarity with model deployment pipelines, containerization (Docker), and cloud-native APIs for scalable machine learning services.
- Proven client communication skills, including requirement gathering, technical demos, progress reporting, and post-deployment support.
- Experience collaborating with cross-functional teams, including product managers, data engineers, and other teams.