April 2018 - Present
Client - Trizetto
Roles and Responsibilities:
* Data Science professional with 1+ years of experience delivering end-to-end data science projects using Python and Tableau.
* Proficient in statistics, machine learning, and building compelling visualizations.
* Worked across multiple domains, including Retail, Banking, and Insurance.
* Experienced across the full project lifecycle: business understanding, data understanding, data preparation, modeling, evaluation, and deployment.
* Experience in data wrangling, data preparation, and data visualization using R and Tableau.
* Strong analytical and problem-solving skills, with the ability to understand current business processes and implement effective solutions.
* Worked with business units across sales and marketing to understand their data needs and provide effective data management and analysis solutions.
* Adept at mining hidden insights and presenting complex data analysis to non-technical audiences.
* Conducted surveys, interviews, and expert meetings to identify client demands and key challenges.

Analytical Skill Set:
* Confusion matrices, ARIMA models, reliability models, stochastic models, Bayesian models, classification models, cluster analysis, anomaly detection, non-parametric methods, and multivariate statistics

Technical Skills:
* Skills: Data Visualization, Data Manipulation, Statistical Analysis, and Machine Learning
* Tools: Tableau
* Programming Languages: Python

Classifying Customers: Classifying customers by location, complaint type, and complaint severity, automated with an SVM.
* Collected data for 100,000 customers spanning 10 different customer types.
* Implemented an SVM with a radial basis function (RBF) kernel to automate the classification.
* Trained on 80,000 rows with 12 features and tested on 20,000 rows.
* Achieved 80% accuracy with a linear kernel; switching to the RBF kernel improved accuracy to 91.53%.

Failure of Encoder Component:
* Before a scanner fails, we need to predict its survival gap. To model this problem, we collected 100,000 (1 lakh) rows to understand the failure rate of each individual scanner.
* Modeled the scanner failure rate from parameters that are reliable indicators of the survival gap.
* Held several discussions with the team responsible for the scanners and extracted the best features for the model.
* Performed feature engineering and selected the best features using a decision tree.
* Built the model on a Random Forest algorithm, achieving 98% accuracy.
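The "Classifying Customers" workflow could be sketched roughly as below. This is a minimal illustration, not the project's actual code: synthetic data stands in for the real customer records (scaled down from 100,000 rows so it runs quickly), and all feature values and parameters here are assumptions.

```python
# Sketch of a multi-class SVM classifier with an RBF kernel,
# as in the customer-complaint project (synthetic stand-in data).
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in: 12 features, 10 customer types
# (scaled down from the project's 100,000 rows for speed).
X, y = make_classification(n_samples=2_000, n_features=12,
                           n_informative=8, n_classes=10, random_state=0)

# 80/20 train/test split, mirroring the 80,000 / 20,000 split described.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

# The RBF ("radial basis") kernel typically beats a linear kernel
# when class boundaries are non-linear; scaling matters for SVMs.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale"))
model.fit(X_train, y_train)
print(f"test accuracy: {accuracy_score(y_test, model.predict(X_test)):.4f}")
```

In practice the kernel choice would be validated the same way the project describes: fit once with `kernel="linear"`, once with `kernel="rbf"`, and compare held-out accuracy.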
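The "Failure of Encoder Component" approach (decision-tree feature selection followed by a Random Forest) could be sketched as below. Again this is an assumed, illustrative setup: the scanner telemetry is replaced by synthetic data, and the feature counts and thresholds are not from the original project.

```python
# Sketch: pick the best features with a decision tree, then fit a
# Random Forest on the selected features (synthetic stand-in data).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectFromModel
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the scanner data (scaled down from 100,000 rows).
X, y = make_classification(n_samples=5_000, n_features=20,
                           n_informative=6, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Step 1: rank features by decision-tree importance and keep those
# above the mean importance, mirroring "picked best features based
# on decision tree".
selector = SelectFromModel(DecisionTreeClassifier(random_state=0),
                           threshold="mean").fit(X_train, y_train)
X_train_sel = selector.transform(X_train)
X_test_sel = selector.transform(X_test)

# Step 2: fit a Random Forest on the reduced feature set.
forest = RandomForestClassifier(n_estimators=200, random_state=0)
forest.fit(X_train_sel, y_train)
print(f"test accuracy: {accuracy_score(y_test, forest.predict(X_test_sel)):.3f}")
```

Dropping low-importance features before the ensemble step keeps the forest focused on the parameters that actually indicate failure, which is the design choice the project description implies.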