With a dynamic 14-year journey in technology, I design and deliver FCCM (Financial Crime and Compliance Management), data science, and machine learning solutions and develop GenAI applications for various business scenarios, including 6+ years of valuable Canadian work experience with clients such as Scotiabank and TD Bank.
I am a seasoned professional specializing in Data Science, Machine Learning, Generative AI, LLM-based AI Agents, and Chatbots.
I navigate the realms of Cloud Native and MLOps/LLMOps, ensuring optimal tech utilization. My collaborative approach extends to working closely with customers, sales, and pre-sales stakeholders. I play a pivotal role in envisioning effective solutions for business scenarios and aligning them with product roadmaps and strategies.
I am always open to enthusiastic collaborations and contributions to interesting projects. Let's connect and explore how we can learn and grow together in Data Science, ML, and AI.
As a Senior Principal Data Scientist at Oracle, my expertise lies in Generative AI, Retrieval-Augmented Generation, Document Summarization/Understanding, and Data Science. Operating primarily on Oracle Cloud Infrastructure, I engage with Product Management, Marketing, and Development teams, contributing valuable insights to enhance Oracle's product offerings.
Led the development and implementation of advanced Generative AI and NLP solutions, leveraging Large Language Models (LLMs) to deliver innovative services to clients. Expertly researched and applied state-of-the-art techniques, demonstrating initiative and self-sufficiency.
Spearheaded the full lifecycle of LLM solutions, encompassing Data Engineering and Prompt Engineering, to optimize user experiences. This included the design, tuning, deployment, and ongoing monitoring of solutions using cutting-edge frameworks such as RAG, underpinned by robust MLOps practices.
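As a small illustration of the RAG pattern referenced above, here is a minimal retrieval sketch, not the production design: TF-IDF stands in for a real embedding model, the chunks and query are invented, and a real system would send the built prompt to an LLM endpoint.

# Minimal RAG retrieval sketch. TF-IDF is a stand-in for a real
# embedding model; chunks, query, and prompt wording are placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

chunks = [
    "Invoices are processed within 3 business days.",
    "Refunds require a signed authorization form.",
    "Transaction alerts are reviewed by the compliance team.",
]

vectorizer = TfidfVectorizer()
chunk_vecs = vectorizer.fit_transform(chunks)

def retrieve(query, k=2):
    # Rank stored chunks by similarity to the query and keep the top k.
    sims = cosine_similarity(vectorizer.transform([query]), chunk_vecs)[0]
    return [chunks[i] for i in sims.argsort()[::-1][:k]]

def build_prompt(query):
    # Ground the model in retrieved context only, then hand off to an LLM.
    context = "\n".join(retrieve(query))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

print(build_prompt("How long does invoice processing take?"))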
Extracted and analyzed data from various types of documents, including structured forms, semi-structured documents, and unstructured text (e.g., invoices, forms, and reports).
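To make the semi-structured case concrete, here is a toy field-extraction sketch; the field names and regex patterns are hypothetical, and a production pipeline would typically layer OCR and layout/LLM models on top of this.

# Toy extraction of key fields from invoice-like text; patterns and
# field names are assumptions for illustration only.
import re

invoice_text = """
Invoice No: INV-2024-0093
Date: 2024-03-15
Total Due: $1,250.00
"""

patterns = {
    "invoice_no": r"Invoice No:\s*(\S+)",
    "date": r"Date:\s*(\d{4}-\d{2}-\d{2})",
    "total_due": r"Total Due:\s*\$([\d,]+\.\d{2})",
}

fields = {
    name: (m.group(1) if (m := re.search(pat, invoice_text)) else None)
    for name, pat in patterns.items()
}
print(fields)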
Actively contributed to high-impact projects by building, scaling, and deploying machine learning models in cloud and on-premises environments, ensuring scalability and effective user adoption.
Built machine learning models for FCCM to identify clients potentially involved in money laundering.
Built models for threshold optimization in transaction monitoring.
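One common framing of threshold optimization, shown here as an illustrative approach rather than the exact production method, is to sweep candidate score cutoffs over historically dispositioned alerts and keep the one that maximizes F1:

# Sweep alert-score thresholds on synthetic history and pick the best F1.
import numpy as np
from sklearn.metrics import precision_recall_curve

rng = np.random.default_rng(0)
scores = rng.random(1000)                             # model risk scores
labels = (scores + rng.normal(0, 0.3, 1000)) > 0.7    # noisy "true alert" flags

precision, recall, thresholds = precision_recall_curve(labels, scores)
f1 = 2 * precision * recall / np.maximum(precision + recall, 1e-9)
best = f1[:-1].argmax()            # thresholds has one fewer entry than f1
print(f"best threshold={thresholds[best]:.3f}, F1={f1[best]:.3f}")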
Developed K-Means clustering models for customer segmentation and behavioral pattern detection.
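A minimal segmentation sketch with scikit-learn; the three behavioral features and the choice of four clusters are assumptions for illustration:

# K-Means customer segmentation on toy behavioral features.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
# Toy features: [monthly_txn_count, avg_txn_amount, cash_ratio]
X = rng.random((500, 3)) * [50, 2000, 1]

X_scaled = StandardScaler().fit_transform(X)  # K-Means is scale-sensitive
kmeans = KMeans(n_clusters=4, n_init=10, random_state=42).fit(X_scaled)
print(np.bincount(kmeans.labels_))            # customers per segment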
Proficient in OCI, Azure, and AWS, including AWS SageMaker and Bedrock.
Extensive hands-on experience with GPT-4.0, LLaMA, Mistral, RAG patterns, LangGraph, AI Agents, and FastAPI.
Big Data Analytical Tools engineering: install, validate, test, and package analytical tools on Red Hat Linux and Wintel platforms.
Publish and enforce Big Data Analytics Tools best practices, configuration recommendations, etc.
Create and publish design documents, usage patterns, and cookbooks for the user community.
Partner with Citi sectors to deliver analytical solutions for large-scale data problems.
Perform security and compliance assessment for Big Data Analytical Tools.
Provide SME and Level-3 technical expertise.
Description: The purpose of the project is to store the bank's historical data, extract meaningful information from it, and use that information to predict the customer's category. The solution is based on the open-source Big Data software Hadoop. The data is stored in the Hadoop file system and processed using Map/Reduce jobs for product and pricing information.
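As a sketch of what such a Map/Reduce step looks like, here is a single-process toy of the Hadoop Streaming pattern; the column layout and category field are assumptions:

# Single-process sketch of a Map/Reduce job: in production, mapper and
# reducer would run as separate Hadoop Streaming scripts over HDFS data.
import sys

def mapper(lines):
    for line in lines:
        fields = line.rstrip("\n").split("\t")
        if len(fields) >= 3:
            yield fields[2], 1            # assume column 3 is the category

def reducer(pairs):
    counts = {}
    for key, n in pairs:
        counts[key] = counts.get(key, 0) + n
    return counts

if __name__ == "__main__":
    print(reducer(mapper(sys.stdin)))     # e.g. cat records.tsv | python job.py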
Role: Responsible for managing data coming from different sources. Involved in developing Pig scripts. Handled importing of data from various data sources, performed transformations using Pig and Hive, loaded the results into HDFS, and extracted data from MySQL into HDFS using Sqoop. Developed Sqoop scripts to enable interaction between Pig and the MySQL database. Wrote Hive queries for data analysis to meet business requirements. Created Hive tables and worked with them using HiveQL.
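By way of illustration, one of these analysis queries could be issued from Python through PyHive; the host, table, and column names below are placeholders, not the actual schema:

# Run an illustrative HiveQL aggregation via PyHive (placeholder names).
from pyhive import hive

conn = hive.Connection(host="hive-server", port=10000, username="etl_user")
cursor = conn.cursor()
cursor.execute("""
    SELECT customer_category, COUNT(*) AS customers
    FROM customer_profiles
    GROUP BY customer_category
""")
for category, count in cursor.fetchall():
    print(category, count)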
Description: The purpose of this project is to analyze sales across different products and generate the following analytics: yearly sales reports; monthly sales for a particular year; quarterly and half-yearly sales reports for a particular year; sales reports by fiscal year and by country; top sales by country; quarterly comparisons across countries; seller recommendations for specific products; and which products are liked or disliked by customers in different countries.
Role: Responsible for extracting data from XML files and loading it into HDFS using Flume. Wrote Hive scripts to generate the desired reports. Imported and exported data between different sources and destinations. Loaded datasets into Hive tables and performed the required data analysis.
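As a rough sketch of the XML-flattening step that precedes loading into HDFS (tag names are assumptions; the actual ingestion used Flume):

# Flatten XML sales records into tab-delimited rows that a Hive
# external table could read; tag names are placeholders.
import xml.etree.ElementTree as ET

xml_data = """
<sales>
  <sale><country>CA</country><product>P100</product><amount>19.99</amount></sale>
  <sale><country>US</country><product>P200</product><amount>5.49</amount></sale>
</sales>
"""

root = ET.fromstring(xml_data)
for sale in root.iter("sale"):
    row = [sale.findtext("country"), sale.findtext("product"), sale.findtext("amount")]
    print("\t".join(row))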
Working in a live production environment as an L2 Engineer, managing data center activities, daily tasks, patching, and bouncing production systems.
Experienced in supporting UNIX production environments, with expertise in analyzing and resolving production issues; monitoring, scheduling, starting, and stopping database jobs; and submitting and troubleshooting jobs. Modified and created UNIX shell scripts to automate back-end processes. Created, modified, maintained, and optimized SQL Server databases. Developed SQL scripts and complex queries.
Implemented backup and restore processes for various databases.
Worked on an MIS team, responsible for the design, development, and implementation of automated reporting and efficiencies using MS Access.
Created an MS Access database to house automated reports and data analysis. Wrote a macro-enabled workbook in MS Excel that reduced user error and cut the time spent completing the Repurchase Account notification form by over 50 percent.
Maintained and updated the MS Access and SQL Server Databases by running backups and tuning queries.
Used SQL to query financial data and produce reports for management.
Created VBA macros to simplify routine tasks for end users.
Below are my key projects in Generative AI, AI Agents, Machine Learning, NLP, and Data Science, leveraging technologies such as LLMs, SQL, Python, AWS, Tableau, and advanced ML techniques.
Below are the details for reaching out to me!