Benchmark Python programs against previous programs I wrote in R (machine learning and deep learning); practical advanced analytics with Python (scikit-learn, pandas, NumPy, PyCharm); Linear Regression, Logistic Regression, Linear Discriminant Analysis, Classification and Regression Trees, Naive Bayes, K-Nearest Neighbors, Learning Vector Quantization, Support Vector Machines, Tree-Based Methods (basics, bagging, random forests, boosting), Resampling Methods (cross-validation, bootstrap), Non-linear Modeling (polynomial regression and step functions, splines, generalized additive models), Unsupervised Learning (K-Means Clustering, Principal Component Analysis); how to approach Linear Model Selection and Regularization (Ridge Regression and the Lasso)
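A minimal sketch of the kind of comparison described above, using several of the listed scikit-learn classifiers under k-fold cross-validation; the synthetic dataset and the parameter choices are illustrative assumptions, not the original benchmark.

```python
# Compare several classifiers from the list above with 5-fold cross-validation
# on a synthetic dataset (the data and settings are made up for illustration).
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=10, random_state=0)

models = {
    "LogReg": LogisticRegression(max_iter=1000),
    "LDA": LinearDiscriminantAnalysis(),
    "CART": DecisionTreeClassifier(random_state=0),
    "NB": GaussianNB(),
    "KNN": KNeighborsClassifier(),
    "SVM": SVC(),
}

# Mean accuracy per model over the 5 folds
scores = {name: cross_val_score(m, X, y, cv=5).mean() for name, m in models.items()}
```

The same loop extends naturally to the tree ensembles (bagging, random forests, boosting) mentioned above.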
Model assessment (mainly the bias-variance trade-off); use techniques adapted to specific problem types (classification, regression): Confidence Interval, Confusion Matrix, Gain and Lift Chart, Kolmogorov-Smirnov Chart, Chi-Square, ROC Curve, Gini Coefficient, Root Mean Square Error, Mean Absolute Error (the L1 counterpart of RMSE), Cross-Validation, Predictive Power
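A sketch of a few of these assessment measures with scikit-learn, on toy labels and scores invented for illustration (the error metrics are applied to the same toy numbers purely to show the formulas):

```python
import numpy as np
from sklearn.metrics import (confusion_matrix, roc_auc_score,
                             mean_squared_error, mean_absolute_error)

# Toy ground truth and predicted probabilities (illustrative only)
y_true = np.array([0, 0, 1, 1, 1, 0])
y_prob = np.array([0.2, 0.4, 0.8, 0.7, 0.3, 0.1])
y_pred = (y_prob >= 0.5).astype(int)        # threshold at 0.5

cm = confusion_matrix(y_true, y_pred)       # rows: actual, cols: predicted
auc = roc_auc_score(y_true, y_prob)         # area under the ROC curve
gini = 2 * auc - 1                          # Gini coefficient derived from AUC

rmse = np.sqrt(mean_squared_error(y_true, y_prob))   # L2-style error
mae = mean_absolute_error(y_true, y_prob)            # L1 counterpart of RMSE
```

Note that RMSE is always at least as large as MAE, since squaring weights large errors more heavily.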
Deep learning (neural networks, mainly for time-series forecasting); Multilayer Perceptron, Convolutional Neural Network (CNN), Long Short-Term Memory network (LSTM), hybrid CNN-LSTM; modeling with TensorFlow
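A minimal sketch, assuming Keras-style input conventions, of the data-preparation step behind time-series forecasting with an LSTM or CNN-LSTM: reshaping a univariate series into the `(samples, timesteps, features)` windows such a model expects.

```python
import numpy as np

def make_windows(series, n_steps):
    """Slice a 1-D series into (samples, timesteps, 1) windows
    plus the next-step target for each window."""
    X, y = [], []
    for i in range(len(series) - n_steps):
        X.append(series[i:i + n_steps])   # the lookback window
        y.append(series[i + n_steps])     # the value to forecast
    return np.array(X).reshape(-1, n_steps, 1), np.array(y)

series = np.arange(10, dtype=float)       # stand-in for a real time series
X, y = make_windows(series, n_steps=3)
# X[0] is [0, 1, 2] and y[0] is 3; X.shape == (7, 3, 1)
```

The resulting `X` and `y` feed directly into a TensorFlow/Keras `LSTM` layer with `input_shape=(3, 1)`.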
Practices to improve performance; data: cleaning, resampling, transformation, rescaling; algorithms: choice of evaluation metric, linear versus non-linear; tuning: parameter optimization via random search and grid search; ensembles: blending model predictions; scalability tasks (storage resources, high availability, distributed systems, fault tolerance, synchronous replication, Hadoop); security (authentication, authorization, audit, user permissions, data lineage)
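The grid-search tuning step mentioned above can be sketched with scikit-learn's `GridSearchCV`; the model, grid, and synthetic data are illustrative assumptions.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=200, random_state=0)

# Small illustrative grid; a real grid would cover more of the parameters.
grid = {"n_estimators": [50, 100], "max_depth": [3, None]}

search = GridSearchCV(RandomForestClassifier(random_state=0), grid, cv=3)
search.fit(X, y)

best = search.best_params_   # the winning parameter combination
```

`RandomizedSearchCV` has the same interface and is the usual choice when the grid grows too large to enumerate.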
Construct a MIM knowledge base; build an incident classification based on IT architectural classes (decision tree); apply it to DevOps as well; derive conclusions for reducing outage time (notable achievement in the hypervisor area, with a DevOps-like approach); use the assigned monitoring tools (Jira equivalent)
Investigate incidents using DevOps techniques (Nagios and Splunk for data collection; XML and JSON files)
Optimize resolution time by applying Automated Guiding Procedures (resolution trees) with probabilities computed per IT architectural class (optimal search for solutions, plus new procedures and standard processes)
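One way to read "resolution trees with probabilities" is a weighted tree walked greedily toward the most likely fix. The sketch below is entirely hypothetical: the node structure, checks, and probabilities are invented for illustration, not taken from the actual procedures.

```python
# Hypothetical resolution tree: each internal node holds a diagnostic check
# and probability-weighted branches; leaves are resolution actions.
tree = {
    "check": "service responding?",
    "branches": [
        (0.7, {"check": "hypervisor healthy?", "branches": [
            (0.9, "restart guest VM"),
            (0.1, "escalate to virtualization team"),
        ]}),
        (0.3, "restart service"),
    ],
}

def most_probable_path(node, prob=1.0):
    """Greedily follow the highest-probability branch to a resolution."""
    if isinstance(node, str):                 # leaf: a resolution action
        return node, prob
    p, child = max(node["branches"], key=lambda b: b[0])
    return most_probable_path(child, prob * p)

action, p = most_probable_path(tree)
# Greedy path: 0.7 * 0.9 -> "restart guest VM" with probability 0.63
```

In practice the branch weights would be re-estimated from incident history per architectural class.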
Monitor performance; data analysis and reporting to management
Collaborate with AWS (Amazon Web Services) for services hosted there; comparison with AWS analytics
Use CRAN, tidyverse, dplyr
Technical preparation of data models; shell scripting; apply appropriate numerical methods (e.g. segmentation)
Benchmark Natural Language Processing solutions
Use BI tools and visualization (e.g. R Shiny, plotly, Tableau, QlikView)
Use Visual Analytics for Data Discovery
Planning and executing data conversion activities: ETL (Extraction, Transformation, Loading) jobs (relying on Data Integration Studio)
Maintenance of ETL jobs (Data Integration Studio) for a bank
Demonstration using Visual Analytics, Enterprise Guide, Enterprise Miner
Design a technical solution for using SAS Enterprise Miner in conjunction with a specific Data Mart, following the SEMMA methodology (Customer Relationship Management; the solutions envisaged Profiling and Segmentation, Campaign Management, Profitability Analysis, etc.)
Define choice criteria between SAS SPDS solution (Scalable Performance Data Server) and a classical RDBMS
Enhance a specific Data Mart (Customer Relationship Management) and design the corresponding data mining applications based on SAS Enterprise Miner; the solutions relied on decision trees, regression (linear, logistic, generalized), time series (decomposition, forecasting, clustering, classification), clustering (k-means, hierarchical), association rules (Apriori), principal component analysis, and anomaly detection, and envisaged:
Profiling and Segmentation
Cross-Sell and Up-Sell
Acquisition and Retention
Campaign Management
Profitability and lifetime value
Market Basket Analysis
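The k-means clustering behind the Profiling and Segmentation work above can be sketched with scikit-learn; the customer features and the three synthetic spending groups are invented for illustration (the original work used SAS Enterprise Miner, not scikit-learn).

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical customer features: [monthly spend, transactions per month],
# drawn from three synthetic behavioral groups.
low = rng.normal([50, 2], [5, 0.5], size=(40, 2))
mid = rng.normal([200, 8], [20, 1.0], size=(40, 2))
high = rng.normal([600, 25], [50, 3.0], size=(40, 2))
X = np.vstack([low, mid, high])

# Standardize first so both features contribute comparably to distances,
# then assign each customer to one of three segments.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(
    StandardScaler().fit_transform(X))
```

The resulting segment labels would then feed profiling, campaign targeting, and profitability analysis per segment.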
Upgrade credit scoring with survival analysis and extend to campaign management
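Survival analysis in credit scoring models the time until default rather than a yes/no label. A minimal sketch of the core tool, the Kaplan-Meier estimator, implemented in plain NumPy (the durations and censoring flags below are toy data, not real loan records):

```python
import numpy as np

def kaplan_meier(durations, events):
    """Kaplan-Meier survival curve: S(t) = prod over event times of
    (1 - d_i / n_i), where d_i defaults occur among n_i still at risk."""
    order = np.argsort(durations)
    durations = np.asarray(durations)[order]
    events = np.asarray(events)[order]
    n = len(durations)
    at_risk, s = n, 1.0
    times, survival = [], []
    i = 0
    while i < n:
        t = durations[i]
        d = c = 0                       # defaults / total leaving at time t
        while i < n and durations[i] == t:
            d += events[i]
            c += 1
            i += 1
        if d:                           # censored-only times don't drop S(t)
            s *= 1 - d / at_risk
            times.append(int(t))
            survival.append(s)
        at_risk -= c
    return times, survival

# Months until default (event=1) or end of observation (censored, event=0)
t, s = kaplan_meier([3, 5, 5, 8, 12, 12], [1, 1, 0, 1, 0, 0])
```

The resulting curve gives the probability a loan survives past each month, which is what a campaign-management extension would act on.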
Technical demonstration of how to use supporting tools to automatically generate and maintain metadata, keep an inventory of data and jobs, perform impact analysis on the data model, accomplish change tracking and release management, and take advantage of performance statistics (mainly with Data Integration Studio)
Provide technical analysis of how to use Data Quality tools
Drive test planning for the data warehouse (DW)
Define the IT Enterprise Architecture management process and approach (in an Erste TOGAF-inspired, COBIT-compliant environment); describe the enterprise architecture model from multiple dimensions, with key features for each; use the Architecture Development Method (ADM); characterize business architecture, data architecture, application architecture, and information architecture; inventory of IT standards. Responsible for the delivered architectural documentation
State IT Architecture principles (standards and methods) and a subsequent scoring method (with KPIs) for IT projects, including agile ones, relying on these principles, in order to ensure consistent development of systems and solutions; evaluate projects' impact
Attend to the implementation of governance, risk, and compliance applications
Prepare/participate in/revise/approve the technical architecture of the data warehouse (ultimately a client data warehouse architecture with dependent data marts and two delivery layers, daily and monthly)
Check the backup/restore/archive and disaster recovery solutions
Select data mining algorithms (data analytics) and solutions for segmentation and quantitative analysis
Outline Data Architecture framework and the way towards Information Architecture
Identify data modeling tools requirements and select metadata tools: Enterprise Architect (from Sparx Systems) and Power Designer (from Sybase - a SAP company)
Connect Cognos BI (Report Studio, Framework Manager) to the Data Warehouse solution; carry out modeling tasks (build a model, add business logic to the model, create and configure a package); prepare reports (by assembling a data source, a model, and stored procedures)
Work out the IT strategy
Implement applications as requested by business lines
Recruit appropriate IT personnel
Decrease operational costs
Work performed (except projects described in the next section):
Approved IT strategy
Optimization of information data flow (re-engineering)
Decreased running costs (by about 25%)
Lead research teams delivering scientific projects on the use of artificial intelligence knowledge and methodology for decision support (based on original C++ code):
“Information system built with neural networks for modeling and forecasting with applications in economy and finance”
“Applications of modeling with neural networks”
“Models for structures with neural networks and their simulation”
Take part in the design and construction of a 32-bit, VAX-compatible minicomputer, where I implemented the floating-point instructions; as a result of this activity I hold Patent RO 98369
Algorithms for pattern recognition (data mining)
Propose and supervise student projects (e.g. pattern classification with a Kohonen network)
Data Scientist freelancer listings

| Contract type | Daily rate | Experience | Location |
|---|---|---|---|
| Freelancers | $555/day | 19 years | United Kingdom |
| Freelancers | $660/day | 11 years | Romania |
| Freelancers | $920/day | 28 years | United Kingdom |
| Freelancers | $660/day | 8 years | France |
| Freelancers | $660/day | 8 years | United Kingdom |
| Freelancers | $830/day | 24 years | Switzerland |
| Freelancers | $830/day | 9 years | France |
| Freelancers | $370/day | 17 years | India |
| Freelancers | $915/day | 19 years | France |
| Freelancers | $960/day | 11 years | France |
| Freelancers | $415/day | 8 years | France |
| Freelancers | $660/day | 16 years | France |